Dec  6 04:00:28 np0005548918 kernel: Linux version 5.14.0-645.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025
Dec  6 04:00:28 np0005548918 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec  6 04:00:28 np0005548918 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  6 04:00:28 np0005548918 kernel: BIOS-provided physical RAM map:
Dec  6 04:00:28 np0005548918 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec  6 04:00:28 np0005548918 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec  6 04:00:28 np0005548918 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec  6 04:00:28 np0005548918 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec  6 04:00:28 np0005548918 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec  6 04:00:28 np0005548918 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec  6 04:00:28 np0005548918 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec  6 04:00:28 np0005548918 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec  6 04:00:28 np0005548918 kernel: NX (Execute Disable) protection: active
Dec  6 04:00:28 np0005548918 kernel: APIC: Static calls initialized
Dec  6 04:00:28 np0005548918 kernel: SMBIOS 2.8 present.
Dec  6 04:00:28 np0005548918 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec  6 04:00:28 np0005548918 kernel: Hypervisor detected: KVM
Dec  6 04:00:28 np0005548918 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec  6 04:00:28 np0005548918 kernel: kvm-clock: using sched offset of 3234242165 cycles
Dec  6 04:00:28 np0005548918 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec  6 04:00:28 np0005548918 kernel: tsc: Detected 2799.998 MHz processor
Dec  6 04:00:28 np0005548918 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec  6 04:00:28 np0005548918 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec  6 04:00:28 np0005548918 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec  6 04:00:28 np0005548918 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec  6 04:00:28 np0005548918 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec  6 04:00:28 np0005548918 kernel: Using GB pages for direct mapping
Dec  6 04:00:28 np0005548918 kernel: RAMDISK: [mem 0x2d472000-0x32a30fff]
Dec  6 04:00:28 np0005548918 kernel: ACPI: Early table checksum verification disabled
Dec  6 04:00:28 np0005548918 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec  6 04:00:28 np0005548918 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  6 04:00:28 np0005548918 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  6 04:00:28 np0005548918 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  6 04:00:28 np0005548918 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec  6 04:00:28 np0005548918 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  6 04:00:28 np0005548918 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  6 04:00:28 np0005548918 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec  6 04:00:28 np0005548918 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec  6 04:00:28 np0005548918 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec  6 04:00:28 np0005548918 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec  6 04:00:28 np0005548918 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec  6 04:00:28 np0005548918 kernel: No NUMA configuration found
Dec  6 04:00:28 np0005548918 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec  6 04:00:28 np0005548918 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Dec  6 04:00:28 np0005548918 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec  6 04:00:28 np0005548918 kernel: Zone ranges:
Dec  6 04:00:28 np0005548918 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec  6 04:00:28 np0005548918 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec  6 04:00:28 np0005548918 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec  6 04:00:28 np0005548918 kernel:  Device   empty
Dec  6 04:00:28 np0005548918 kernel: Movable zone start for each node
Dec  6 04:00:28 np0005548918 kernel: Early memory node ranges
Dec  6 04:00:28 np0005548918 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec  6 04:00:28 np0005548918 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec  6 04:00:28 np0005548918 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec  6 04:00:28 np0005548918 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec  6 04:00:28 np0005548918 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec  6 04:00:28 np0005548918 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec  6 04:00:28 np0005548918 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec  6 04:00:28 np0005548918 kernel: ACPI: PM-Timer IO Port: 0x608
Dec  6 04:00:28 np0005548918 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec  6 04:00:28 np0005548918 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec  6 04:00:28 np0005548918 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec  6 04:00:28 np0005548918 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec  6 04:00:28 np0005548918 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec  6 04:00:28 np0005548918 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec  6 04:00:28 np0005548918 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec  6 04:00:28 np0005548918 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec  6 04:00:28 np0005548918 kernel: TSC deadline timer available
Dec  6 04:00:28 np0005548918 kernel: CPU topo: Max. logical packages:   8
Dec  6 04:00:28 np0005548918 kernel: CPU topo: Max. logical dies:       8
Dec  6 04:00:28 np0005548918 kernel: CPU topo: Max. dies per package:   1
Dec  6 04:00:28 np0005548918 kernel: CPU topo: Max. threads per core:   1
Dec  6 04:00:28 np0005548918 kernel: CPU topo: Num. cores per package:     1
Dec  6 04:00:28 np0005548918 kernel: CPU topo: Num. threads per package:   1
Dec  6 04:00:28 np0005548918 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec  6 04:00:28 np0005548918 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec  6 04:00:28 np0005548918 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec  6 04:00:28 np0005548918 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec  6 04:00:28 np0005548918 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec  6 04:00:28 np0005548918 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec  6 04:00:28 np0005548918 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec  6 04:00:28 np0005548918 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec  6 04:00:28 np0005548918 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec  6 04:00:28 np0005548918 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec  6 04:00:28 np0005548918 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec  6 04:00:28 np0005548918 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec  6 04:00:28 np0005548918 kernel: Booting paravirtualized kernel on KVM
Dec  6 04:00:28 np0005548918 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec  6 04:00:28 np0005548918 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec  6 04:00:28 np0005548918 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec  6 04:00:28 np0005548918 kernel: kvm-guest: PV spinlocks disabled, no host support
Dec  6 04:00:28 np0005548918 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  6 04:00:28 np0005548918 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64", will be passed to user space.
Dec  6 04:00:28 np0005548918 kernel: random: crng init done
Dec  6 04:00:28 np0005548918 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec  6 04:00:28 np0005548918 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec  6 04:00:28 np0005548918 kernel: Fallback order for Node 0: 0 
Dec  6 04:00:28 np0005548918 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec  6 04:00:28 np0005548918 kernel: Policy zone: Normal
Dec  6 04:00:28 np0005548918 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec  6 04:00:28 np0005548918 kernel: software IO TLB: area num 8.
Dec  6 04:00:28 np0005548918 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec  6 04:00:28 np0005548918 kernel: ftrace: allocating 49335 entries in 193 pages
Dec  6 04:00:28 np0005548918 kernel: ftrace: allocated 193 pages with 3 groups
Dec  6 04:00:28 np0005548918 kernel: Dynamic Preempt: voluntary
Dec  6 04:00:28 np0005548918 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec  6 04:00:28 np0005548918 kernel: rcu: 	RCU event tracing is enabled.
Dec  6 04:00:28 np0005548918 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec  6 04:00:28 np0005548918 kernel: 	Trampoline variant of Tasks RCU enabled.
Dec  6 04:00:28 np0005548918 kernel: 	Rude variant of Tasks RCU enabled.
Dec  6 04:00:28 np0005548918 kernel: 	Tracing variant of Tasks RCU enabled.
Dec  6 04:00:28 np0005548918 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec  6 04:00:28 np0005548918 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec  6 04:00:28 np0005548918 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  6 04:00:28 np0005548918 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  6 04:00:28 np0005548918 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  6 04:00:28 np0005548918 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec  6 04:00:28 np0005548918 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec  6 04:00:28 np0005548918 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec  6 04:00:28 np0005548918 kernel: Console: colour VGA+ 80x25
Dec  6 04:00:28 np0005548918 kernel: printk: console [ttyS0] enabled
Dec  6 04:00:28 np0005548918 kernel: ACPI: Core revision 20230331
Dec  6 04:00:28 np0005548918 kernel: APIC: Switch to symmetric I/O mode setup
Dec  6 04:00:28 np0005548918 kernel: x2apic enabled
Dec  6 04:00:28 np0005548918 kernel: APIC: Switched APIC routing to: physical x2apic
Dec  6 04:00:28 np0005548918 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec  6 04:00:28 np0005548918 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec  6 04:00:28 np0005548918 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec  6 04:00:28 np0005548918 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec  6 04:00:28 np0005548918 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec  6 04:00:28 np0005548918 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec  6 04:00:28 np0005548918 kernel: Spectre V2 : Mitigation: Retpolines
Dec  6 04:00:28 np0005548918 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec  6 04:00:28 np0005548918 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec  6 04:00:28 np0005548918 kernel: RETBleed: Mitigation: untrained return thunk
Dec  6 04:00:28 np0005548918 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec  6 04:00:28 np0005548918 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec  6 04:00:28 np0005548918 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec  6 04:00:28 np0005548918 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec  6 04:00:28 np0005548918 kernel: x86/bugs: return thunk changed
Dec  6 04:00:28 np0005548918 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec  6 04:00:28 np0005548918 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec  6 04:00:28 np0005548918 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec  6 04:00:28 np0005548918 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec  6 04:00:28 np0005548918 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec  6 04:00:28 np0005548918 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec  6 04:00:28 np0005548918 kernel: Freeing SMP alternatives memory: 40K
Dec  6 04:00:28 np0005548918 kernel: pid_max: default: 32768 minimum: 301
Dec  6 04:00:28 np0005548918 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec  6 04:00:28 np0005548918 kernel: landlock: Up and running.
Dec  6 04:00:28 np0005548918 kernel: Yama: becoming mindful.
Dec  6 04:00:28 np0005548918 kernel: SELinux:  Initializing.
Dec  6 04:00:28 np0005548918 kernel: LSM support for eBPF active
Dec  6 04:00:28 np0005548918 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  6 04:00:28 np0005548918 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  6 04:00:28 np0005548918 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec  6 04:00:28 np0005548918 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec  6 04:00:28 np0005548918 kernel: ... version:                0
Dec  6 04:00:28 np0005548918 kernel: ... bit width:              48
Dec  6 04:00:28 np0005548918 kernel: ... generic registers:      6
Dec  6 04:00:28 np0005548918 kernel: ... value mask:             0000ffffffffffff
Dec  6 04:00:28 np0005548918 kernel: ... max period:             00007fffffffffff
Dec  6 04:00:28 np0005548918 kernel: ... fixed-purpose events:   0
Dec  6 04:00:28 np0005548918 kernel: ... event mask:             000000000000003f
Dec  6 04:00:28 np0005548918 kernel: signal: max sigframe size: 1776
Dec  6 04:00:28 np0005548918 kernel: rcu: Hierarchical SRCU implementation.
Dec  6 04:00:28 np0005548918 kernel: rcu: 	Max phase no-delay instances is 400.
Dec  6 04:00:28 np0005548918 kernel: smp: Bringing up secondary CPUs ...
Dec  6 04:00:28 np0005548918 kernel: smpboot: x86: Booting SMP configuration:
Dec  6 04:00:28 np0005548918 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec  6 04:00:28 np0005548918 kernel: smp: Brought up 1 node, 8 CPUs
Dec  6 04:00:28 np0005548918 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec  6 04:00:28 np0005548918 kernel: node 0 deferred pages initialised in 9ms
Dec  6 04:00:28 np0005548918 kernel: Memory: 7763736K/8388068K available (16384K kernel code, 5795K rwdata, 13908K rodata, 4196K init, 7156K bss, 618208K reserved, 0K cma-reserved)
Dec  6 04:00:28 np0005548918 kernel: devtmpfs: initialized
Dec  6 04:00:28 np0005548918 kernel: x86/mm: Memory block size: 128MB
Dec  6 04:00:28 np0005548918 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec  6 04:00:28 np0005548918 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec  6 04:00:28 np0005548918 kernel: pinctrl core: initialized pinctrl subsystem
Dec  6 04:00:28 np0005548918 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec  6 04:00:28 np0005548918 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec  6 04:00:28 np0005548918 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec  6 04:00:28 np0005548918 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec  6 04:00:28 np0005548918 kernel: audit: initializing netlink subsys (disabled)
Dec  6 04:00:28 np0005548918 kernel: audit: type=2000 audit(1765011627.215:1): state=initialized audit_enabled=0 res=1
Dec  6 04:00:28 np0005548918 kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec  6 04:00:28 np0005548918 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec  6 04:00:28 np0005548918 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec  6 04:00:28 np0005548918 kernel: cpuidle: using governor menu
Dec  6 04:00:28 np0005548918 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec  6 04:00:28 np0005548918 kernel: PCI: Using configuration type 1 for base access
Dec  6 04:00:28 np0005548918 kernel: PCI: Using configuration type 1 for extended access
Dec  6 04:00:28 np0005548918 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec  6 04:00:28 np0005548918 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec  6 04:00:28 np0005548918 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec  6 04:00:28 np0005548918 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec  6 04:00:28 np0005548918 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec  6 04:00:28 np0005548918 kernel: Demotion targets for Node 0: null
Dec  6 04:00:28 np0005548918 kernel: cryptd: max_cpu_qlen set to 1000
Dec  6 04:00:28 np0005548918 kernel: ACPI: Added _OSI(Module Device)
Dec  6 04:00:28 np0005548918 kernel: ACPI: Added _OSI(Processor Device)
Dec  6 04:00:28 np0005548918 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec  6 04:00:28 np0005548918 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec  6 04:00:28 np0005548918 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec  6 04:00:28 np0005548918 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec  6 04:00:28 np0005548918 kernel: ACPI: Interpreter enabled
Dec  6 04:00:28 np0005548918 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec  6 04:00:28 np0005548918 kernel: ACPI: Using IOAPIC for interrupt routing
Dec  6 04:00:28 np0005548918 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec  6 04:00:28 np0005548918 kernel: PCI: Using E820 reservations for host bridge windows
Dec  6 04:00:28 np0005548918 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec  6 04:00:28 np0005548918 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec  6 04:00:28 np0005548918 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [3] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [4] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [5] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [6] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [7] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [8] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [9] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [10] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [11] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [12] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [13] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [14] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [15] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [16] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [17] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [18] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [19] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [20] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [21] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [22] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [23] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [24] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [25] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [26] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [27] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [28] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [29] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [30] registered
Dec  6 04:00:28 np0005548918 kernel: acpiphp: Slot [31] registered
Dec  6 04:00:28 np0005548918 kernel: PCI host bridge to bus 0000:00
Dec  6 04:00:28 np0005548918 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec  6 04:00:28 np0005548918 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec  6 04:00:28 np0005548918 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec  6 04:00:28 np0005548918 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec  6 04:00:28 np0005548918 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec  6 04:00:28 np0005548918 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec  6 04:00:28 np0005548918 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec  6 04:00:28 np0005548918 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec  6 04:00:28 np0005548918 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec  6 04:00:28 np0005548918 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec  6 04:00:28 np0005548918 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec  6 04:00:28 np0005548918 kernel: iommu: Default domain type: Translated
Dec  6 04:00:28 np0005548918 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec  6 04:00:28 np0005548918 kernel: SCSI subsystem initialized
Dec  6 04:00:28 np0005548918 kernel: ACPI: bus type USB registered
Dec  6 04:00:28 np0005548918 kernel: usbcore: registered new interface driver usbfs
Dec  6 04:00:28 np0005548918 kernel: usbcore: registered new interface driver hub
Dec  6 04:00:28 np0005548918 kernel: usbcore: registered new device driver usb
Dec  6 04:00:28 np0005548918 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec  6 04:00:28 np0005548918 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec  6 04:00:28 np0005548918 kernel: PTP clock support registered
Dec  6 04:00:28 np0005548918 kernel: EDAC MC: Ver: 3.0.0
Dec  6 04:00:28 np0005548918 kernel: NetLabel: Initializing
Dec  6 04:00:28 np0005548918 kernel: NetLabel:  domain hash size = 128
Dec  6 04:00:28 np0005548918 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec  6 04:00:28 np0005548918 kernel: NetLabel:  unlabeled traffic allowed by default
Dec  6 04:00:28 np0005548918 kernel: PCI: Using ACPI for IRQ routing
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec  6 04:00:28 np0005548918 kernel: vgaarb: loaded
Dec  6 04:00:28 np0005548918 kernel: clocksource: Switched to clocksource kvm-clock
Dec  6 04:00:28 np0005548918 kernel: VFS: Disk quotas dquot_6.6.0
Dec  6 04:00:28 np0005548918 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec  6 04:00:28 np0005548918 kernel: pnp: PnP ACPI init
Dec  6 04:00:28 np0005548918 kernel: pnp: PnP ACPI: found 5 devices
Dec  6 04:00:28 np0005548918 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec  6 04:00:28 np0005548918 kernel: NET: Registered PF_INET protocol family
Dec  6 04:00:28 np0005548918 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec  6 04:00:28 np0005548918 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec  6 04:00:28 np0005548918 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec  6 04:00:28 np0005548918 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec  6 04:00:28 np0005548918 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec  6 04:00:28 np0005548918 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec  6 04:00:28 np0005548918 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec  6 04:00:28 np0005548918 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  6 04:00:28 np0005548918 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  6 04:00:28 np0005548918 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec  6 04:00:28 np0005548918 kernel: NET: Registered PF_XDP protocol family
Dec  6 04:00:28 np0005548918 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec  6 04:00:28 np0005548918 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec  6 04:00:28 np0005548918 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec  6 04:00:28 np0005548918 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec  6 04:00:28 np0005548918 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec  6 04:00:28 np0005548918 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec  6 04:00:28 np0005548918 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 81090 usecs
Dec  6 04:00:28 np0005548918 kernel: PCI: CLS 0 bytes, default 64
Dec  6 04:00:28 np0005548918 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec  6 04:00:28 np0005548918 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec  6 04:00:28 np0005548918 kernel: ACPI: bus type thunderbolt registered
Dec  6 04:00:28 np0005548918 kernel: Trying to unpack rootfs image as initramfs...
Dec  6 04:00:28 np0005548918 kernel: Initialise system trusted keyrings
Dec  6 04:00:28 np0005548918 kernel: Key type blacklist registered
Dec  6 04:00:28 np0005548918 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec  6 04:00:28 np0005548918 kernel: zbud: loaded
Dec  6 04:00:28 np0005548918 kernel: integrity: Platform Keyring initialized
Dec  6 04:00:28 np0005548918 kernel: integrity: Machine keyring initialized
Dec  6 04:00:28 np0005548918 kernel: Freeing initrd memory: 87804K
Dec  6 04:00:28 np0005548918 kernel: NET: Registered PF_ALG protocol family
Dec  6 04:00:28 np0005548918 kernel: xor: automatically using best checksumming function   avx       
Dec  6 04:00:28 np0005548918 kernel: Key type asymmetric registered
Dec  6 04:00:28 np0005548918 kernel: Asymmetric key parser 'x509' registered
Dec  6 04:00:28 np0005548918 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec  6 04:00:28 np0005548918 kernel: io scheduler mq-deadline registered
Dec  6 04:00:28 np0005548918 kernel: io scheduler kyber registered
Dec  6 04:00:28 np0005548918 kernel: io scheduler bfq registered
Dec  6 04:00:28 np0005548918 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec  6 04:00:28 np0005548918 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec  6 04:00:28 np0005548918 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec  6 04:00:28 np0005548918 kernel: ACPI: button: Power Button [PWRF]
Dec  6 04:00:28 np0005548918 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec  6 04:00:28 np0005548918 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec  6 04:00:28 np0005548918 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec  6 04:00:28 np0005548918 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec  6 04:00:28 np0005548918 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec  6 04:00:28 np0005548918 kernel: Non-volatile memory driver v1.3
Dec  6 04:00:28 np0005548918 kernel: rdac: device handler registered
Dec  6 04:00:28 np0005548918 kernel: hp_sw: device handler registered
Dec  6 04:00:28 np0005548918 kernel: emc: device handler registered
Dec  6 04:00:28 np0005548918 kernel: alua: device handler registered
Dec  6 04:00:28 np0005548918 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec  6 04:00:28 np0005548918 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec  6 04:00:28 np0005548918 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec  6 04:00:28 np0005548918 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec  6 04:00:28 np0005548918 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec  6 04:00:28 np0005548918 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec  6 04:00:28 np0005548918 kernel: usb usb1: Product: UHCI Host Controller
Dec  6 04:00:28 np0005548918 kernel: usb usb1: Manufacturer: Linux 5.14.0-645.el9.x86_64 uhci_hcd
Dec  6 04:00:28 np0005548918 kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec  6 04:00:28 np0005548918 kernel: hub 1-0:1.0: USB hub found
Dec  6 04:00:28 np0005548918 kernel: hub 1-0:1.0: 2 ports detected
Dec  6 04:00:28 np0005548918 kernel: usbcore: registered new interface driver usbserial_generic
Dec  6 04:00:28 np0005548918 kernel: usbserial: USB Serial support registered for generic
Dec  6 04:00:28 np0005548918 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec  6 04:00:28 np0005548918 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec  6 04:00:28 np0005548918 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec  6 04:00:28 np0005548918 kernel: mousedev: PS/2 mouse device common for all mice
Dec  6 04:00:28 np0005548918 kernel: rtc_cmos 00:04: RTC can wake from S4
Dec  6 04:00:28 np0005548918 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec  6 04:00:28 np0005548918 kernel: rtc_cmos 00:04: registered as rtc0
Dec  6 04:00:28 np0005548918 kernel: rtc_cmos 00:04: setting system clock to 2025-12-06T09:00:27 UTC (1765011627)
Dec  6 04:00:28 np0005548918 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec  6 04:00:28 np0005548918 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec  6 04:00:28 np0005548918 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec  6 04:00:28 np0005548918 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec  6 04:00:28 np0005548918 kernel: usbcore: registered new interface driver usbhid
Dec  6 04:00:28 np0005548918 kernel: usbhid: USB HID core driver
Dec  6 04:00:28 np0005548918 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec  6 04:00:28 np0005548918 kernel: drop_monitor: Initializing network drop monitor service
Dec  6 04:00:28 np0005548918 kernel: Initializing XFRM netlink socket
Dec  6 04:00:28 np0005548918 kernel: NET: Registered PF_INET6 protocol family
Dec  6 04:00:28 np0005548918 kernel: Segment Routing with IPv6
Dec  6 04:00:28 np0005548918 kernel: NET: Registered PF_PACKET protocol family
Dec  6 04:00:28 np0005548918 kernel: mpls_gso: MPLS GSO support
Dec  6 04:00:28 np0005548918 kernel: IPI shorthand broadcast: enabled
Dec  6 04:00:28 np0005548918 kernel: AVX2 version of gcm_enc/dec engaged.
Dec  6 04:00:28 np0005548918 kernel: AES CTR mode by8 optimization enabled
Dec  6 04:00:28 np0005548918 kernel: sched_clock: Marking stable (1196003846, 150834454)->(1466722754, -119884454)
Dec  6 04:00:28 np0005548918 kernel: registered taskstats version 1
Dec  6 04:00:28 np0005548918 kernel: Loading compiled-in X.509 certificates
Dec  6 04:00:28 np0005548918 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec  6 04:00:28 np0005548918 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec  6 04:00:28 np0005548918 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec  6 04:00:28 np0005548918 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec  6 04:00:28 np0005548918 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec  6 04:00:28 np0005548918 kernel: Demotion targets for Node 0: null
Dec  6 04:00:28 np0005548918 kernel: page_owner is disabled
Dec  6 04:00:28 np0005548918 kernel: Key type .fscrypt registered
Dec  6 04:00:28 np0005548918 kernel: Key type fscrypt-provisioning registered
Dec  6 04:00:28 np0005548918 kernel: Key type big_key registered
Dec  6 04:00:28 np0005548918 kernel: Key type encrypted registered
Dec  6 04:00:28 np0005548918 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec  6 04:00:28 np0005548918 kernel: Loading compiled-in module X.509 certificates
Dec  6 04:00:28 np0005548918 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec  6 04:00:28 np0005548918 kernel: ima: Allocated hash algorithm: sha256
Dec  6 04:00:28 np0005548918 kernel: ima: No architecture policies found
Dec  6 04:00:28 np0005548918 kernel: evm: Initialising EVM extended attributes:
Dec  6 04:00:28 np0005548918 kernel: evm: security.selinux
Dec  6 04:00:28 np0005548918 kernel: evm: security.SMACK64 (disabled)
Dec  6 04:00:28 np0005548918 kernel: evm: security.SMACK64EXEC (disabled)
Dec  6 04:00:28 np0005548918 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec  6 04:00:28 np0005548918 kernel: evm: security.SMACK64MMAP (disabled)
Dec  6 04:00:28 np0005548918 kernel: evm: security.apparmor (disabled)
Dec  6 04:00:28 np0005548918 kernel: evm: security.ima
Dec  6 04:00:28 np0005548918 kernel: evm: security.capability
Dec  6 04:00:28 np0005548918 kernel: evm: HMAC attrs: 0x1
Dec  6 04:00:28 np0005548918 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec  6 04:00:28 np0005548918 kernel: Running certificate verification RSA selftest
Dec  6 04:00:28 np0005548918 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec  6 04:00:28 np0005548918 kernel: Running certificate verification ECDSA selftest
Dec  6 04:00:28 np0005548918 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec  6 04:00:28 np0005548918 kernel: clk: Disabling unused clocks
Dec  6 04:00:28 np0005548918 kernel: Freeing unused decrypted memory: 2028K
Dec  6 04:00:28 np0005548918 kernel: Freeing unused kernel image (initmem) memory: 4196K
Dec  6 04:00:28 np0005548918 kernel: Write protecting the kernel read-only data: 30720k
Dec  6 04:00:28 np0005548918 kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Dec  6 04:00:28 np0005548918 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec  6 04:00:28 np0005548918 kernel: Run /init as init process
Dec  6 04:00:28 np0005548918 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec  6 04:00:28 np0005548918 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec  6 04:00:28 np0005548918 kernel: usb 1-1: Product: QEMU USB Tablet
Dec  6 04:00:28 np0005548918 kernel: usb 1-1: Manufacturer: QEMU
Dec  6 04:00:28 np0005548918 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec  6 04:00:28 np0005548918 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  6 04:00:28 np0005548918 systemd: Detected virtualization kvm.
Dec  6 04:00:28 np0005548918 systemd: Detected architecture x86-64.
Dec  6 04:00:28 np0005548918 systemd: Running in initrd.
Dec  6 04:00:28 np0005548918 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec  6 04:00:28 np0005548918 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec  6 04:00:28 np0005548918 systemd: No hostname configured, using default hostname.
Dec  6 04:00:28 np0005548918 systemd: Hostname set to <localhost>.
Dec  6 04:00:28 np0005548918 systemd: Initializing machine ID from VM UUID.
Dec  6 04:00:28 np0005548918 systemd: Queued start job for default target Initrd Default Target.
Dec  6 04:00:28 np0005548918 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  6 04:00:28 np0005548918 systemd: Reached target Local Encrypted Volumes.
Dec  6 04:00:28 np0005548918 systemd: Reached target Initrd /usr File System.
Dec  6 04:00:28 np0005548918 systemd: Reached target Local File Systems.
Dec  6 04:00:28 np0005548918 systemd: Reached target Path Units.
Dec  6 04:00:28 np0005548918 systemd: Reached target Slice Units.
Dec  6 04:00:28 np0005548918 systemd: Reached target Swaps.
Dec  6 04:00:28 np0005548918 systemd: Reached target Timer Units.
Dec  6 04:00:28 np0005548918 systemd: Listening on D-Bus System Message Bus Socket.
Dec  6 04:00:28 np0005548918 systemd: Listening on Journal Socket (/dev/log).
Dec  6 04:00:28 np0005548918 systemd: Listening on Journal Socket.
Dec  6 04:00:28 np0005548918 systemd: Listening on udev Control Socket.
Dec  6 04:00:28 np0005548918 systemd: Listening on udev Kernel Socket.
Dec  6 04:00:28 np0005548918 systemd: Reached target Socket Units.
Dec  6 04:00:28 np0005548918 systemd: Starting Create List of Static Device Nodes...
Dec  6 04:00:28 np0005548918 systemd: Starting Journal Service...
Dec  6 04:00:28 np0005548918 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  6 04:00:28 np0005548918 systemd: Starting Apply Kernel Variables...
Dec  6 04:00:28 np0005548918 systemd: Starting Create System Users...
Dec  6 04:00:28 np0005548918 systemd: Starting Setup Virtual Console...
Dec  6 04:00:28 np0005548918 systemd: Finished Create List of Static Device Nodes.
Dec  6 04:00:28 np0005548918 systemd: Finished Apply Kernel Variables.
Dec  6 04:00:28 np0005548918 systemd: Finished Create System Users.
Dec  6 04:00:28 np0005548918 systemd-journald[308]: Journal started
Dec  6 04:00:28 np0005548918 systemd-journald[308]: Runtime Journal (/run/log/journal/9abb64abb4a84a6994923018ef71c6f2) is 8.0M, max 153.6M, 145.6M free.
Dec  6 04:00:28 np0005548918 systemd-sysusers[313]: Creating group 'users' with GID 100.
Dec  6 04:00:28 np0005548918 systemd-sysusers[313]: Creating group 'dbus' with GID 81.
Dec  6 04:00:28 np0005548918 systemd-sysusers[313]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec  6 04:00:28 np0005548918 systemd: Started Journal Service.
Dec  6 04:00:28 np0005548918 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec  6 04:00:28 np0005548918 systemd[1]: Starting Create Volatile Files and Directories...
Dec  6 04:00:28 np0005548918 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  6 04:00:28 np0005548918 systemd[1]: Finished Setup Virtual Console.
Dec  6 04:00:28 np0005548918 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec  6 04:00:28 np0005548918 systemd[1]: Starting dracut cmdline hook...
Dec  6 04:00:28 np0005548918 dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Dec  6 04:00:28 np0005548918 systemd[1]: Finished Create Volatile Files and Directories.
Dec  6 04:00:28 np0005548918 dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  6 04:00:28 np0005548918 systemd[1]: Finished dracut cmdline hook.
Dec  6 04:00:28 np0005548918 systemd[1]: Starting dracut pre-udev hook...
Dec  6 04:00:28 np0005548918 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec  6 04:00:28 np0005548918 kernel: device-mapper: uevent: version 1.0.3
Dec  6 04:00:28 np0005548918 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec  6 04:00:28 np0005548918 kernel: RPC: Registered named UNIX socket transport module.
Dec  6 04:00:28 np0005548918 kernel: RPC: Registered udp transport module.
Dec  6 04:00:28 np0005548918 kernel: RPC: Registered tcp transport module.
Dec  6 04:00:28 np0005548918 kernel: RPC: Registered tcp-with-tls transport module.
Dec  6 04:00:28 np0005548918 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec  6 04:00:28 np0005548918 rpc.statd[446]: Version 2.5.4 starting
Dec  6 04:00:28 np0005548918 rpc.statd[446]: Initializing NSM state
Dec  6 04:00:28 np0005548918 rpc.idmapd[451]: Setting log level to 0
Dec  6 04:00:28 np0005548918 systemd[1]: Finished dracut pre-udev hook.
Dec  6 04:00:28 np0005548918 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  6 04:00:28 np0005548918 systemd-udevd[464]: Using default interface naming scheme 'rhel-9.0'.
Dec  6 04:00:28 np0005548918 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  6 04:00:28 np0005548918 systemd[1]: Starting dracut pre-trigger hook...
Dec  6 04:00:28 np0005548918 systemd[1]: Finished dracut pre-trigger hook.
Dec  6 04:00:28 np0005548918 systemd[1]: Starting Coldplug All udev Devices...
Dec  6 04:00:28 np0005548918 systemd[1]: Created slice Slice /system/modprobe.
Dec  6 04:00:28 np0005548918 systemd[1]: Starting Load Kernel Module configfs...
Dec  6 04:00:28 np0005548918 systemd[1]: Finished Coldplug All udev Devices.
Dec  6 04:00:28 np0005548918 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  6 04:00:28 np0005548918 systemd[1]: Finished Load Kernel Module configfs.
Dec  6 04:00:28 np0005548918 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  6 04:00:28 np0005548918 systemd[1]: Reached target Network.
Dec  6 04:00:28 np0005548918 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  6 04:00:28 np0005548918 systemd[1]: Starting dracut initqueue hook...
Dec  6 04:00:28 np0005548918 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec  6 04:00:28 np0005548918 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec  6 04:00:28 np0005548918 kernel: vda: vda1
Dec  6 04:00:28 np0005548918 kernel: scsi host0: ata_piix
Dec  6 04:00:28 np0005548918 kernel: scsi host1: ata_piix
Dec  6 04:00:28 np0005548918 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec  6 04:00:28 np0005548918 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec  6 04:00:28 np0005548918 systemd[1]: Found device /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec  6 04:00:28 np0005548918 systemd[1]: Reached target Initrd Root Device.
Dec  6 04:00:29 np0005548918 systemd[1]: Mounting Kernel Configuration File System...
Dec  6 04:00:29 np0005548918 kernel: ata1: found unknown device (class 0)
Dec  6 04:00:29 np0005548918 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec  6 04:00:29 np0005548918 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec  6 04:00:29 np0005548918 systemd-udevd[496]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:00:29 np0005548918 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec  6 04:00:29 np0005548918 systemd[1]: Mounted Kernel Configuration File System.
Dec  6 04:00:29 np0005548918 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec  6 04:00:29 np0005548918 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec  6 04:00:29 np0005548918 systemd[1]: Reached target System Initialization.
Dec  6 04:00:29 np0005548918 systemd[1]: Reached target Basic System.
Dec  6 04:00:29 np0005548918 systemd[1]: Finished dracut initqueue hook.
Dec  6 04:00:29 np0005548918 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  6 04:00:29 np0005548918 systemd[1]: Reached target Remote Encrypted Volumes.
Dec  6 04:00:29 np0005548918 systemd[1]: Reached target Remote File Systems.
Dec  6 04:00:29 np0005548918 systemd[1]: Starting dracut pre-mount hook...
Dec  6 04:00:29 np0005548918 systemd[1]: Finished dracut pre-mount hook.
Dec  6 04:00:29 np0005548918 systemd[1]: Starting File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f...
Dec  6 04:00:29 np0005548918 systemd-fsck[559]: /usr/sbin/fsck.xfs: XFS file system.
Dec  6 04:00:29 np0005548918 systemd[1]: Finished File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec  6 04:00:29 np0005548918 systemd[1]: Mounting /sysroot...
Dec  6 04:00:29 np0005548918 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec  6 04:00:29 np0005548918 kernel: XFS (vda1): Mounting V5 Filesystem fcf6b761-831a-48a7-9f5f-068b5063763f
Dec  6 04:00:29 np0005548918 kernel: XFS (vda1): Ending clean mount
Dec  6 04:00:29 np0005548918 systemd[1]: Mounted /sysroot.
Dec  6 04:00:29 np0005548918 systemd[1]: Reached target Initrd Root File System.
Dec  6 04:00:29 np0005548918 systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec  6 04:00:29 np0005548918 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec  6 04:00:29 np0005548918 systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec  6 04:00:29 np0005548918 systemd[1]: Reached target Initrd File Systems.
Dec  6 04:00:29 np0005548918 systemd[1]: Reached target Initrd Default Target.
Dec  6 04:00:29 np0005548918 systemd[1]: Starting dracut mount hook...
Dec  6 04:00:29 np0005548918 systemd[1]: Finished dracut mount hook.
Dec  6 04:00:29 np0005548918 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec  6 04:00:30 np0005548918 rpc.idmapd[451]: exiting on signal 15
Dec  6 04:00:30 np0005548918 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec  6 04:00:30 np0005548918 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Network.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Remote Encrypted Volumes.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Timer Units.
Dec  6 04:00:30 np0005548918 systemd[1]: dbus.socket: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Closed D-Bus System Message Bus Socket.
Dec  6 04:00:30 np0005548918 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Initrd Default Target.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Basic System.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Initrd Root Device.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Initrd /usr File System.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Path Units.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Remote File Systems.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Preparation for Remote File Systems.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Slice Units.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Socket Units.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target System Initialization.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Local File Systems.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Swaps.
Dec  6 04:00:30 np0005548918 systemd[1]: dracut-mount.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped dracut mount hook.
Dec  6 04:00:30 np0005548918 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped dracut pre-mount hook.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped target Local Encrypted Volumes.
Dec  6 04:00:30 np0005548918 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec  6 04:00:30 np0005548918 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped dracut initqueue hook.
Dec  6 04:00:30 np0005548918 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped Apply Kernel Variables.
Dec  6 04:00:30 np0005548918 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped Create Volatile Files and Directories.
Dec  6 04:00:30 np0005548918 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped Coldplug All udev Devices.
Dec  6 04:00:30 np0005548918 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped dracut pre-trigger hook.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec  6 04:00:30 np0005548918 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped Setup Virtual Console.
Dec  6 04:00:30 np0005548918 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec  6 04:00:30 np0005548918 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec  6 04:00:30 np0005548918 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Closed udev Control Socket.
Dec  6 04:00:30 np0005548918 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Closed udev Kernel Socket.
Dec  6 04:00:30 np0005548918 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped dracut pre-udev hook.
Dec  6 04:00:30 np0005548918 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped dracut cmdline hook.
Dec  6 04:00:30 np0005548918 systemd[1]: Starting Cleanup udev Database...
Dec  6 04:00:30 np0005548918 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec  6 04:00:30 np0005548918 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped Create List of Static Device Nodes.
Dec  6 04:00:30 np0005548918 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Stopped Create System Users.
Dec  6 04:00:30 np0005548918 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Finished Cleanup udev Database.
Dec  6 04:00:30 np0005548918 systemd[1]: Reached target Switch Root.
Dec  6 04:00:30 np0005548918 systemd[1]: Starting Switch Root...
Dec  6 04:00:30 np0005548918 systemd[1]: Switching root.
Dec  6 04:00:30 np0005548918 systemd-journald[308]: Received SIGTERM from PID 1 (systemd).
Dec  6 04:00:30 np0005548918 systemd-journald[308]: Journal stopped
Dec  6 04:00:30 np0005548918 kernel: audit: type=1404 audit(1765011630.298:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec  6 04:00:30 np0005548918 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:00:30 np0005548918 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:00:30 np0005548918 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:00:30 np0005548918 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:00:30 np0005548918 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:00:30 np0005548918 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:00:30 np0005548918 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:00:30 np0005548918 kernel: audit: type=1403 audit(1765011630.439:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec  6 04:00:30 np0005548918 systemd: Successfully loaded SELinux policy in 144.732ms.
Dec  6 04:00:30 np0005548918 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 35.621ms.
Dec  6 04:00:30 np0005548918 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  6 04:00:30 np0005548918 systemd: Detected virtualization kvm.
Dec  6 04:00:30 np0005548918 systemd: Detected architecture x86-64.
Dec  6 04:00:30 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:00:30 np0005548918 systemd: initrd-switch-root.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd: Stopped Switch Root.
Dec  6 04:00:30 np0005548918 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec  6 04:00:30 np0005548918 systemd: Created slice Slice /system/getty.
Dec  6 04:00:30 np0005548918 systemd: Created slice Slice /system/serial-getty.
Dec  6 04:00:30 np0005548918 systemd: Created slice Slice /system/sshd-keygen.
Dec  6 04:00:30 np0005548918 systemd: Created slice User and Session Slice.
Dec  6 04:00:30 np0005548918 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  6 04:00:30 np0005548918 systemd: Started Forward Password Requests to Wall Directory Watch.
Dec  6 04:00:30 np0005548918 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec  6 04:00:30 np0005548918 systemd: Reached target Local Encrypted Volumes.
Dec  6 04:00:30 np0005548918 systemd: Stopped target Switch Root.
Dec  6 04:00:30 np0005548918 systemd: Stopped target Initrd File Systems.
Dec  6 04:00:30 np0005548918 systemd: Stopped target Initrd Root File System.
Dec  6 04:00:30 np0005548918 systemd: Reached target Local Integrity Protected Volumes.
Dec  6 04:00:30 np0005548918 systemd: Reached target Path Units.
Dec  6 04:00:30 np0005548918 systemd: Reached target rpc_pipefs.target.
Dec  6 04:00:30 np0005548918 systemd: Reached target Slice Units.
Dec  6 04:00:30 np0005548918 systemd: Reached target Swaps.
Dec  6 04:00:30 np0005548918 systemd: Reached target Local Verity Protected Volumes.
Dec  6 04:00:30 np0005548918 systemd: Listening on RPCbind Server Activation Socket.
Dec  6 04:00:30 np0005548918 systemd: Reached target RPC Port Mapper.
Dec  6 04:00:30 np0005548918 systemd: Listening on Process Core Dump Socket.
Dec  6 04:00:30 np0005548918 systemd: Listening on initctl Compatibility Named Pipe.
Dec  6 04:00:30 np0005548918 systemd: Listening on udev Control Socket.
Dec  6 04:00:30 np0005548918 systemd: Listening on udev Kernel Socket.
Dec  6 04:00:30 np0005548918 systemd: Mounting Huge Pages File System...
Dec  6 04:00:30 np0005548918 systemd: Mounting POSIX Message Queue File System...
Dec  6 04:00:30 np0005548918 systemd: Mounting Kernel Debug File System...
Dec  6 04:00:30 np0005548918 systemd: Mounting Kernel Trace File System...
Dec  6 04:00:30 np0005548918 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  6 04:00:30 np0005548918 systemd: Starting Create List of Static Device Nodes...
Dec  6 04:00:30 np0005548918 systemd: Starting Load Kernel Module configfs...
Dec  6 04:00:30 np0005548918 systemd: Starting Load Kernel Module drm...
Dec  6 04:00:30 np0005548918 systemd: Starting Load Kernel Module efi_pstore...
Dec  6 04:00:30 np0005548918 systemd: Starting Load Kernel Module fuse...
Dec  6 04:00:30 np0005548918 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec  6 04:00:30 np0005548918 systemd: systemd-fsck-root.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd: Stopped File System Check on Root Device.
Dec  6 04:00:30 np0005548918 systemd: Stopped Journal Service.
Dec  6 04:00:30 np0005548918 systemd: Starting Journal Service...
Dec  6 04:00:30 np0005548918 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  6 04:00:30 np0005548918 systemd: Starting Generate network units from Kernel command line...
Dec  6 04:00:30 np0005548918 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  6 04:00:30 np0005548918 systemd: Starting Remount Root and Kernel File Systems...
Dec  6 04:00:30 np0005548918 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec  6 04:00:30 np0005548918 systemd: Starting Apply Kernel Variables...
Dec  6 04:00:30 np0005548918 kernel: fuse: init (API version 7.37)
Dec  6 04:00:30 np0005548918 systemd: Starting Coldplug All udev Devices...
Dec  6 04:00:30 np0005548918 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec  6 04:00:30 np0005548918 systemd: Mounted Huge Pages File System.
Dec  6 04:00:30 np0005548918 systemd: Mounted POSIX Message Queue File System.
Dec  6 04:00:30 np0005548918 systemd: Mounted Kernel Debug File System.
Dec  6 04:00:30 np0005548918 systemd: Mounted Kernel Trace File System.
Dec  6 04:00:30 np0005548918 systemd: Finished Create List of Static Device Nodes.
Dec  6 04:00:30 np0005548918 systemd: modprobe@configfs.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd: Finished Load Kernel Module configfs.
Dec  6 04:00:30 np0005548918 systemd: modprobe@efi_pstore.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd: Finished Load Kernel Module efi_pstore.
Dec  6 04:00:30 np0005548918 systemd-journald[686]: Journal started
Dec  6 04:00:30 np0005548918 systemd-journald[686]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec  6 04:00:30 np0005548918 systemd[1]: Queued start job for default target Multi-User System.
Dec  6 04:00:30 np0005548918 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd: Started Journal Service.
Dec  6 04:00:30 np0005548918 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Finished Load Kernel Module fuse.
Dec  6 04:00:30 np0005548918 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec  6 04:00:30 np0005548918 systemd[1]: Finished Generate network units from Kernel command line.
Dec  6 04:00:30 np0005548918 systemd[1]: Finished Remount Root and Kernel File Systems.
Dec  6 04:00:30 np0005548918 systemd[1]: Finished Apply Kernel Variables.
Dec  6 04:00:30 np0005548918 kernel: ACPI: bus type drm_connector registered
Dec  6 04:00:30 np0005548918 systemd[1]: Mounting FUSE Control File System...
Dec  6 04:00:30 np0005548918 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  6 04:00:30 np0005548918 systemd[1]: Starting Rebuild Hardware Database...
Dec  6 04:00:30 np0005548918 systemd[1]: Starting Flush Journal to Persistent Storage...
Dec  6 04:00:30 np0005548918 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec  6 04:00:30 np0005548918 systemd[1]: Starting Load/Save OS Random Seed...
Dec  6 04:00:30 np0005548918 systemd[1]: Starting Create System Users...
Dec  6 04:00:30 np0005548918 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec  6 04:00:30 np0005548918 systemd[1]: Finished Load Kernel Module drm.
Dec  6 04:00:31 np0005548918 systemd[1]: Mounted FUSE Control File System.
Dec  6 04:00:31 np0005548918 systemd-journald[686]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec  6 04:00:31 np0005548918 systemd-journald[686]: Received client request to flush runtime journal.
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Flush Journal to Persistent Storage.
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Load/Save OS Random Seed.
Dec  6 04:00:31 np0005548918 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Coldplug All udev Devices.
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Create System Users.
Dec  6 04:00:31 np0005548918 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  6 04:00:31 np0005548918 systemd[1]: Reached target Preparation for Local File Systems.
Dec  6 04:00:31 np0005548918 systemd[1]: Reached target Local File Systems.
Dec  6 04:00:31 np0005548918 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec  6 04:00:31 np0005548918 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec  6 04:00:31 np0005548918 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec  6 04:00:31 np0005548918 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec  6 04:00:31 np0005548918 systemd[1]: Starting Automatic Boot Loader Update...
Dec  6 04:00:31 np0005548918 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec  6 04:00:31 np0005548918 systemd[1]: Starting Create Volatile Files and Directories...
Dec  6 04:00:31 np0005548918 bootctl[702]: Couldn't find EFI system partition, skipping.
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Automatic Boot Loader Update.
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Create Volatile Files and Directories.
Dec  6 04:00:31 np0005548918 systemd[1]: Starting Security Auditing Service...
Dec  6 04:00:31 np0005548918 systemd[1]: Starting RPC Bind...
Dec  6 04:00:31 np0005548918 systemd[1]: Starting Rebuild Journal Catalog...
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec  6 04:00:31 np0005548918 auditd[708]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec  6 04:00:31 np0005548918 auditd[708]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec  6 04:00:31 np0005548918 systemd[1]: Started RPC Bind.
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Rebuild Journal Catalog.
Dec  6 04:00:31 np0005548918 augenrules[713]: /sbin/augenrules: No change
Dec  6 04:00:31 np0005548918 augenrules[728]: No rules
Dec  6 04:00:31 np0005548918 augenrules[728]: enabled 1
Dec  6 04:00:31 np0005548918 augenrules[728]: failure 1
Dec  6 04:00:31 np0005548918 augenrules[728]: pid 708
Dec  6 04:00:31 np0005548918 augenrules[728]: rate_limit 0
Dec  6 04:00:31 np0005548918 augenrules[728]: backlog_limit 8192
Dec  6 04:00:31 np0005548918 augenrules[728]: lost 0
Dec  6 04:00:31 np0005548918 augenrules[728]: backlog 1
Dec  6 04:00:31 np0005548918 augenrules[728]: backlog_wait_time 60000
Dec  6 04:00:31 np0005548918 augenrules[728]: backlog_wait_time_actual 0
Dec  6 04:00:31 np0005548918 augenrules[728]: enabled 1
Dec  6 04:00:31 np0005548918 augenrules[728]: failure 1
Dec  6 04:00:31 np0005548918 augenrules[728]: pid 708
Dec  6 04:00:31 np0005548918 augenrules[728]: rate_limit 0
Dec  6 04:00:31 np0005548918 augenrules[728]: backlog_limit 8192
Dec  6 04:00:31 np0005548918 augenrules[728]: lost 0
Dec  6 04:00:31 np0005548918 augenrules[728]: backlog 0
Dec  6 04:00:31 np0005548918 augenrules[728]: backlog_wait_time 60000
Dec  6 04:00:31 np0005548918 augenrules[728]: backlog_wait_time_actual 0
Dec  6 04:00:31 np0005548918 augenrules[728]: enabled 1
Dec  6 04:00:31 np0005548918 augenrules[728]: failure 1
Dec  6 04:00:31 np0005548918 augenrules[728]: pid 708
Dec  6 04:00:31 np0005548918 augenrules[728]: rate_limit 0
Dec  6 04:00:31 np0005548918 augenrules[728]: backlog_limit 8192
Dec  6 04:00:31 np0005548918 augenrules[728]: lost 0
Dec  6 04:00:31 np0005548918 augenrules[728]: backlog 1
Dec  6 04:00:31 np0005548918 augenrules[728]: backlog_wait_time 60000
Dec  6 04:00:31 np0005548918 augenrules[728]: backlog_wait_time_actual 0
Dec  6 04:00:31 np0005548918 systemd[1]: Started Security Auditing Service.
Dec  6 04:00:31 np0005548918 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Rebuild Hardware Database.
Dec  6 04:00:31 np0005548918 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  6 04:00:31 np0005548918 systemd[1]: Starting Update is Completed...
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Update is Completed.
Dec  6 04:00:31 np0005548918 systemd-udevd[736]: Using default interface naming scheme 'rhel-9.0'.
Dec  6 04:00:31 np0005548918 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  6 04:00:31 np0005548918 systemd[1]: Reached target System Initialization.
Dec  6 04:00:31 np0005548918 systemd[1]: Started dnf makecache --timer.
Dec  6 04:00:31 np0005548918 systemd[1]: Started Daily rotation of log files.
Dec  6 04:00:31 np0005548918 systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec  6 04:00:31 np0005548918 systemd[1]: Reached target Timer Units.
Dec  6 04:00:31 np0005548918 systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec  6 04:00:31 np0005548918 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec  6 04:00:31 np0005548918 systemd[1]: Reached target Socket Units.
Dec  6 04:00:31 np0005548918 systemd[1]: Starting D-Bus System Message Bus...
Dec  6 04:00:31 np0005548918 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  6 04:00:31 np0005548918 systemd[1]: Starting Load Kernel Module configfs...
Dec  6 04:00:31 np0005548918 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec  6 04:00:31 np0005548918 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Load Kernel Module configfs.
Dec  6 04:00:31 np0005548918 systemd[1]: Started D-Bus System Message Bus.
Dec  6 04:00:31 np0005548918 systemd[1]: Reached target Basic System.
Dec  6 04:00:31 np0005548918 dbus-broker-lau[747]: Ready
Dec  6 04:00:31 np0005548918 systemd-udevd[748]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:00:31 np0005548918 systemd[1]: Starting NTP client/server...
Dec  6 04:00:31 np0005548918 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec  6 04:00:31 np0005548918 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec  6 04:00:31 np0005548918 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec  6 04:00:31 np0005548918 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec  6 04:00:31 np0005548918 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec  6 04:00:31 np0005548918 chronyd[789]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  6 04:00:31 np0005548918 chronyd[789]: Loaded 0 symmetric keys
Dec  6 04:00:31 np0005548918 chronyd[789]: Using right/UTC timezone to obtain leap second data
Dec  6 04:00:31 np0005548918 chronyd[789]: Loaded seccomp filter (level 2)
Dec  6 04:00:31 np0005548918 systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec  6 04:00:31 np0005548918 systemd[1]: Starting IPv4 firewall with iptables...
Dec  6 04:00:31 np0005548918 systemd[1]: Started irqbalance daemon.
Dec  6 04:00:31 np0005548918 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec  6 04:00:31 np0005548918 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  6 04:00:31 np0005548918 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  6 04:00:31 np0005548918 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  6 04:00:31 np0005548918 systemd[1]: Reached target sshd-keygen.target.
Dec  6 04:00:31 np0005548918 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec  6 04:00:31 np0005548918 systemd[1]: Reached target User and Group Name Lookups.
Dec  6 04:00:31 np0005548918 systemd[1]: Starting User Login Management...
Dec  6 04:00:31 np0005548918 systemd[1]: Started NTP client/server.
Dec  6 04:00:31 np0005548918 systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec  6 04:00:31 np0005548918 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec  6 04:00:31 np0005548918 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec  6 04:00:31 np0005548918 kernel: Console: switching to colour dummy device 80x25
Dec  6 04:00:31 np0005548918 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec  6 04:00:31 np0005548918 kernel: [drm] features: -context_init
Dec  6 04:00:31 np0005548918 kernel: [drm] number of scanouts: 1
Dec  6 04:00:31 np0005548918 kernel: [drm] number of cap sets: 0
Dec  6 04:00:31 np0005548918 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec  6 04:00:31 np0005548918 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec  6 04:00:31 np0005548918 kernel: Console: switching to colour frame buffer device 128x48
Dec  6 04:00:31 np0005548918 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec  6 04:00:31 np0005548918 kernel: kvm_amd: TSC scaling supported
Dec  6 04:00:31 np0005548918 kernel: kvm_amd: Nested Virtualization enabled
Dec  6 04:00:31 np0005548918 kernel: kvm_amd: Nested Paging enabled
Dec  6 04:00:31 np0005548918 kernel: kvm_amd: LBR virtualization supported
Dec  6 04:00:32 np0005548918 systemd-logind[800]: New seat seat0.
Dec  6 04:00:32 np0005548918 systemd-logind[800]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  6 04:00:32 np0005548918 systemd-logind[800]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  6 04:00:32 np0005548918 systemd[1]: Started User Login Management.
Dec  6 04:00:32 np0005548918 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec  6 04:00:32 np0005548918 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec  6 04:00:32 np0005548918 iptables.init[794]: iptables: Applying firewall rules: [  OK  ]
Dec  6 04:00:32 np0005548918 systemd[1]: Finished IPv4 firewall with iptables.
Dec  6 04:00:32 np0005548918 cloud-init[845]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 06 Dec 2025 09:00:32 +0000. Up 5.92 seconds.
Dec  6 04:00:32 np0005548918 systemd[1]: run-cloud\x2dinit-tmp-tmp0df8z3dg.mount: Deactivated successfully.
Dec  6 04:00:32 np0005548918 systemd[1]: Starting Hostname Service...
Dec  6 04:00:32 np0005548918 systemd[1]: Started Hostname Service.
Dec  6 04:00:32 np0005548918 systemd-hostnamed[859]: Hostname set to <np0005548918.novalocal> (static)
Dec  6 04:00:32 np0005548918 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec  6 04:00:32 np0005548918 systemd[1]: Reached target Preparation for Network.
Dec  6 04:00:32 np0005548918 systemd[1]: Starting Network Manager...
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.7956] NetworkManager (version 1.54.1-1.el9) is starting... (boot:6ee0f712-c10e-4848-9a0e-9942073e400e)
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.7961] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8037] manager[0x55add24ee080]: monitoring kernel firmware directory '/lib/firmware'.
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8073] hostname: hostname: using hostnamed
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8073] hostname: static hostname changed from (none) to "np0005548918.novalocal"
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8077] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8168] manager[0x55add24ee080]: rfkill: Wi-Fi hardware radio set enabled
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8169] manager[0x55add24ee080]: rfkill: WWAN hardware radio set enabled
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8208] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8208] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8209] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8209] manager: Networking is enabled by state file
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8211] settings: Loaded settings plugin: keyfile (internal)
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8219] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8239] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8250] dhcp: init: Using DHCP client 'internal'
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8252] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8264] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  6 04:00:32 np0005548918 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8270] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8276] device (lo): Activation: starting connection 'lo' (ff720ffb-c083-491a-b3ff-e737ba278b15)
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8285] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8287] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8334] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8339] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8341] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8343] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8346] device (eth0): carrier: link connected
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8349] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8356] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8362] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8367] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8367] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8370] manager: NetworkManager state is now CONNECTING
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8371] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8378] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8381] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:00:32 np0005548918 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  6 04:00:32 np0005548918 systemd[1]: Started Network Manager.
Dec  6 04:00:32 np0005548918 systemd[1]: Reached target Network.
Dec  6 04:00:32 np0005548918 systemd[1]: Starting Network Manager Wait Online...
Dec  6 04:00:32 np0005548918 systemd[1]: Starting GSSAPI Proxy Daemon...
Dec  6 04:00:32 np0005548918 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8737] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8740] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  6 04:00:32 np0005548918 NetworkManager[863]: <info>  [1765011632.8747] device (lo): Activation: successful, device activated.
Dec  6 04:00:32 np0005548918 systemd[1]: Started GSSAPI Proxy Daemon.
Dec  6 04:00:32 np0005548918 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  6 04:00:32 np0005548918 systemd[1]: Reached target NFS client services.
Dec  6 04:00:32 np0005548918 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  6 04:00:32 np0005548918 systemd[1]: Reached target Remote File Systems.
Dec  6 04:00:32 np0005548918 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  6 04:00:34 np0005548918 NetworkManager[863]: <info>  [1765011634.1660] dhcp4 (eth0): state changed new lease, address=38.102.83.74
Dec  6 04:00:34 np0005548918 NetworkManager[863]: <info>  [1765011634.1673] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  6 04:00:34 np0005548918 NetworkManager[863]: <info>  [1765011634.1693] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:00:34 np0005548918 NetworkManager[863]: <info>  [1765011634.1722] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:00:34 np0005548918 NetworkManager[863]: <info>  [1765011634.1723] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:00:34 np0005548918 NetworkManager[863]: <info>  [1765011634.1725] manager: NetworkManager state is now CONNECTED_SITE
Dec  6 04:00:34 np0005548918 NetworkManager[863]: <info>  [1765011634.1727] device (eth0): Activation: successful, device activated.
Dec  6 04:00:34 np0005548918 NetworkManager[863]: <info>  [1765011634.1730] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  6 04:00:34 np0005548918 NetworkManager[863]: <info>  [1765011634.1732] manager: startup complete
Dec  6 04:00:34 np0005548918 systemd[1]: Finished Network Manager Wait Online.
Dec  6 04:00:34 np0005548918 systemd[1]: Starting Cloud-init: Network Stage...
Dec  6 04:00:34 np0005548918 cloud-init[926]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 06 Dec 2025 09:00:34 +0000. Up 8.13 seconds.
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: |  eth0  | True |         38.102.83.74         | 255.255.255.0 | global | fa:16:3e:c0:e3:66 |
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: |  eth0  | True | fe80::f816:3eff:fec0:e366/64 |       .       |  link  | fa:16:3e:c0:e3:66 |
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec  6 04:00:34 np0005548918 cloud-init[926]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  6 04:00:35 np0005548918 cloud-init[926]: Generating public/private rsa key pair.
Dec  6 04:00:35 np0005548918 cloud-init[926]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec  6 04:00:35 np0005548918 cloud-init[926]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec  6 04:00:35 np0005548918 cloud-init[926]: The key fingerprint is:
Dec  6 04:00:35 np0005548918 cloud-init[926]: SHA256:6vJ82ifUTz4ZRIk3rpZrcf5bG1CtJAWstKjWRlpu4qI root@np0005548918.novalocal
Dec  6 04:00:35 np0005548918 cloud-init[926]: The key's randomart image is:
Dec  6 04:00:35 np0005548918 cloud-init[926]: +---[RSA 3072]----+
Dec  6 04:00:35 np0005548918 cloud-init[926]: |           o.o.  |
Dec  6 04:00:35 np0005548918 cloud-init[926]: |          o *.  .|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |         o *.....|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |        + o oo.. |
Dec  6 04:00:35 np0005548918 cloud-init[926]: |       BS. + ..  |
Dec  6 04:00:35 np0005548918 cloud-init[926]: |      =.* * + .  |
Dec  6 04:00:35 np0005548918 cloud-init[926]: |     o.= . O o ..|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |    oo..o + *  .o|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |  E. +=o.+   ooo |
Dec  6 04:00:35 np0005548918 cloud-init[926]: +----[SHA256]-----+
Dec  6 04:00:35 np0005548918 cloud-init[926]: Generating public/private ecdsa key pair.
Dec  6 04:00:35 np0005548918 cloud-init[926]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec  6 04:00:35 np0005548918 cloud-init[926]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec  6 04:00:35 np0005548918 cloud-init[926]: The key fingerprint is:
Dec  6 04:00:35 np0005548918 cloud-init[926]: SHA256:803zxDC50tPNKltRSTro1XHCtJQmlz1RMhYiaPvsXnc root@np0005548918.novalocal
Dec  6 04:00:35 np0005548918 cloud-init[926]: The key's randomart image is:
Dec  6 04:00:35 np0005548918 cloud-init[926]: +---[ECDSA 256]---+
Dec  6 04:00:35 np0005548918 cloud-init[926]: |         .. .oOO=|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |        o  .o=X*=|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |       . . .+B.oo|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |        . ...*.+ |
Dec  6 04:00:35 np0005548918 cloud-init[926]: |        So..* = o|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |         oo+ = o |
Dec  6 04:00:35 np0005548918 cloud-init[926]: |         .. + = E|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |          .. = . |
Dec  6 04:00:35 np0005548918 cloud-init[926]: |         .. .    |
Dec  6 04:00:35 np0005548918 cloud-init[926]: +----[SHA256]-----+
Dec  6 04:00:35 np0005548918 cloud-init[926]: Generating public/private ed25519 key pair.
Dec  6 04:00:35 np0005548918 cloud-init[926]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec  6 04:00:35 np0005548918 cloud-init[926]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec  6 04:00:35 np0005548918 cloud-init[926]: The key fingerprint is:
Dec  6 04:00:35 np0005548918 cloud-init[926]: SHA256:QzR/aDbP1gcucpwNb1ovZThmmMZcBAZzbc3yvNh+FrA root@np0005548918.novalocal
Dec  6 04:00:35 np0005548918 cloud-init[926]: The key's randomart image is:
Dec  6 04:00:35 np0005548918 cloud-init[926]: +--[ED25519 256]--+
Dec  6 04:00:35 np0005548918 cloud-init[926]: |        o o.+o.o |
Dec  6 04:00:35 np0005548918 cloud-init[926]: |       . o = .+ o|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |        . * o.o+ |
Dec  6 04:00:35 np0005548918 cloud-init[926]: |       . o O % oo|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |        S . # ^ =|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |         . = E X |
Dec  6 04:00:35 np0005548918 cloud-init[926]: |            . o o|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |               oo|
Dec  6 04:00:35 np0005548918 cloud-init[926]: |               ..|
Dec  6 04:00:35 np0005548918 cloud-init[926]: +----[SHA256]-----+
Dec  6 04:00:35 np0005548918 systemd[1]: Finished Cloud-init: Network Stage.
Dec  6 04:00:35 np0005548918 systemd[1]: Reached target Cloud-config availability.
Dec  6 04:00:35 np0005548918 systemd[1]: Reached target Network is Online.
Dec  6 04:00:35 np0005548918 systemd[1]: Starting Cloud-init: Config Stage...
Dec  6 04:00:35 np0005548918 systemd[1]: Starting Crash recovery kernel arming...
Dec  6 04:00:35 np0005548918 systemd[1]: Starting Notify NFS peers of a restart...
Dec  6 04:00:35 np0005548918 systemd[1]: Starting System Logging Service...
Dec  6 04:00:35 np0005548918 sm-notify[1010]: Version 2.5.4 starting
Dec  6 04:00:35 np0005548918 systemd[1]: Starting OpenSSH server daemon...
Dec  6 04:00:35 np0005548918 systemd[1]: Starting Permit User Sessions...
Dec  6 04:00:35 np0005548918 systemd[1]: Started Notify NFS peers of a restart.
Dec  6 04:00:35 np0005548918 systemd[1]: Started OpenSSH server daemon.
Dec  6 04:00:35 np0005548918 systemd[1]: Finished Permit User Sessions.
Dec  6 04:00:35 np0005548918 systemd[1]: Started Command Scheduler.
Dec  6 04:00:35 np0005548918 rsyslogd[1011]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1011" x-info="https://www.rsyslog.com"] start
Dec  6 04:00:35 np0005548918 rsyslogd[1011]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec  6 04:00:35 np0005548918 systemd[1]: Started Getty on tty1.
Dec  6 04:00:35 np0005548918 systemd[1]: Started Serial Getty on ttyS0.
Dec  6 04:00:35 np0005548918 systemd[1]: Reached target Login Prompts.
Dec  6 04:00:35 np0005548918 systemd[1]: Started System Logging Service.
Dec  6 04:00:35 np0005548918 systemd[1]: Reached target Multi-User System.
Dec  6 04:00:35 np0005548918 systemd[1]: Starting Record Runlevel Change in UTMP...
Dec  6 04:00:35 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:00:35 np0005548918 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec  6 04:00:35 np0005548918 systemd[1]: Finished Record Runlevel Change in UTMP.
Dec  6 04:00:35 np0005548918 kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Dec  6 04:00:35 np0005548918 kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-645.el9.x86_64kdump.img
Dec  6 04:00:36 np0005548918 cloud-init[1170]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 06 Dec 2025 09:00:36 +0000. Up 9.66 seconds.
Dec  6 04:00:36 np0005548918 systemd[1]: Finished Cloud-init: Config Stage.
Dec  6 04:00:36 np0005548918 systemd[1]: Starting Cloud-init: Final Stage...
Dec  6 04:00:36 np0005548918 dracut[1289]: dracut-057-102.git20250818.el9
Dec  6 04:00:36 np0005548918 dracut[1291]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-645.el9.x86_64kdump.img 5.14.0-645.el9.x86_64
Dec  6 04:00:36 np0005548918 cloud-init[1334]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 06 Dec 2025 09:00:36 +0000. Up 10.08 seconds.
Dec  6 04:00:36 np0005548918 cloud-init[1361]: #############################################################
Dec  6 04:00:36 np0005548918 cloud-init[1362]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec  6 04:00:36 np0005548918 cloud-init[1364]: 256 SHA256:803zxDC50tPNKltRSTro1XHCtJQmlz1RMhYiaPvsXnc root@np0005548918.novalocal (ECDSA)
Dec  6 04:00:36 np0005548918 cloud-init[1369]: 256 SHA256:QzR/aDbP1gcucpwNb1ovZThmmMZcBAZzbc3yvNh+FrA root@np0005548918.novalocal (ED25519)
Dec  6 04:00:36 np0005548918 cloud-init[1371]: 3072 SHA256:6vJ82ifUTz4ZRIk3rpZrcf5bG1CtJAWstKjWRlpu4qI root@np0005548918.novalocal (RSA)
Dec  6 04:00:36 np0005548918 cloud-init[1372]: -----END SSH HOST KEY FINGERPRINTS-----
Dec  6 04:00:36 np0005548918 cloud-init[1375]: #############################################################
Dec  6 04:00:36 np0005548918 cloud-init[1334]: Cloud-init v. 24.4-7.el9 finished at Sat, 06 Dec 2025 09:00:36 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.26 seconds
Dec  6 04:00:36 np0005548918 systemd[1]: Finished Cloud-init: Final Stage.
Dec  6 04:00:36 np0005548918 systemd[1]: Reached target Cloud-init target.
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  6 04:00:36 np0005548918 dracut[1291]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: memstrack is not available
Dec  6 04:00:37 np0005548918 dracut[1291]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  6 04:00:37 np0005548918 dracut[1291]: memstrack is not available
Dec  6 04:00:37 np0005548918 dracut[1291]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  6 04:00:37 np0005548918 dracut[1291]: *** Including module: systemd ***
Dec  6 04:00:38 np0005548918 dracut[1291]: *** Including module: fips ***
Dec  6 04:00:38 np0005548918 dracut[1291]: *** Including module: systemd-initrd ***
Dec  6 04:00:38 np0005548918 dracut[1291]: *** Including module: i18n ***
Dec  6 04:00:38 np0005548918 dracut[1291]: *** Including module: drm ***
Dec  6 04:00:38 np0005548918 dracut[1291]: *** Including module: prefixdevname ***
Dec  6 04:00:38 np0005548918 dracut[1291]: *** Including module: kernel-modules ***
Dec  6 04:00:39 np0005548918 kernel: block vda: the capability attribute has been deprecated.
Dec  6 04:00:39 np0005548918 chronyd[789]: Selected source 174.142.148.226 (2.centos.pool.ntp.org)
Dec  6 04:00:39 np0005548918 chronyd[789]: System clock TAI offset set to 37 seconds
Dec  6 04:00:39 np0005548918 dracut[1291]: *** Including module: kernel-modules-extra ***
Dec  6 04:00:39 np0005548918 dracut[1291]: *** Including module: qemu ***
Dec  6 04:00:39 np0005548918 dracut[1291]: *** Including module: fstab-sys ***
Dec  6 04:00:39 np0005548918 dracut[1291]: *** Including module: rootfs-block ***
Dec  6 04:00:39 np0005548918 dracut[1291]: *** Including module: terminfo ***
Dec  6 04:00:39 np0005548918 dracut[1291]: *** Including module: udev-rules ***
Dec  6 04:00:40 np0005548918 dracut[1291]: Skipping udev rule: 91-permissions.rules
Dec  6 04:00:40 np0005548918 dracut[1291]: Skipping udev rule: 80-drivers-modprobe.rules
Dec  6 04:00:40 np0005548918 dracut[1291]: *** Including module: virtiofs ***
Dec  6 04:00:40 np0005548918 dracut[1291]: *** Including module: dracut-systemd ***
Dec  6 04:00:40 np0005548918 dracut[1291]: *** Including module: usrmount ***
Dec  6 04:00:40 np0005548918 dracut[1291]: *** Including module: base ***
Dec  6 04:00:40 np0005548918 dracut[1291]: *** Including module: fs-lib ***
Dec  6 04:00:40 np0005548918 dracut[1291]: *** Including module: kdumpbase ***
Dec  6 04:00:41 np0005548918 dracut[1291]: *** Including module: microcode_ctl-fw_dir_override ***
Dec  6 04:00:41 np0005548918 dracut[1291]:  microcode_ctl module: mangling fw_dir
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: configuration "intel" is ignored
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec  6 04:00:41 np0005548918 dracut[1291]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec  6 04:00:41 np0005548918 dracut[1291]: *** Including module: openssl ***
Dec  6 04:00:41 np0005548918 dracut[1291]: *** Including module: shutdown ***
Dec  6 04:00:41 np0005548918 dracut[1291]: *** Including module: squash ***
Dec  6 04:00:41 np0005548918 dracut[1291]: *** Including modules done ***
Dec  6 04:00:41 np0005548918 dracut[1291]: *** Installing kernel module dependencies ***
Dec  6 04:00:42 np0005548918 irqbalance[796]: Cannot change IRQ 25 affinity: Operation not permitted
Dec  6 04:00:42 np0005548918 irqbalance[796]: IRQ 25 affinity is now unmanaged
Dec  6 04:00:42 np0005548918 irqbalance[796]: Cannot change IRQ 31 affinity: Operation not permitted
Dec  6 04:00:42 np0005548918 irqbalance[796]: IRQ 31 affinity is now unmanaged
Dec  6 04:00:42 np0005548918 irqbalance[796]: Cannot change IRQ 28 affinity: Operation not permitted
Dec  6 04:00:42 np0005548918 irqbalance[796]: IRQ 28 affinity is now unmanaged
Dec  6 04:00:42 np0005548918 irqbalance[796]: Cannot change IRQ 32 affinity: Operation not permitted
Dec  6 04:00:42 np0005548918 irqbalance[796]: IRQ 32 affinity is now unmanaged
Dec  6 04:00:42 np0005548918 irqbalance[796]: Cannot change IRQ 30 affinity: Operation not permitted
Dec  6 04:00:42 np0005548918 irqbalance[796]: IRQ 30 affinity is now unmanaged
Dec  6 04:00:42 np0005548918 irqbalance[796]: Cannot change IRQ 29 affinity: Operation not permitted
Dec  6 04:00:42 np0005548918 irqbalance[796]: IRQ 29 affinity is now unmanaged
Dec  6 04:00:42 np0005548918 dracut[1291]: *** Installing kernel module dependencies done ***
Dec  6 04:00:42 np0005548918 dracut[1291]: *** Resolving executable dependencies ***
Dec  6 04:00:44 np0005548918 dracut[1291]: *** Resolving executable dependencies done ***
Dec  6 04:00:44 np0005548918 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  6 04:00:44 np0005548918 dracut[1291]: *** Generating early-microcode cpio image ***
Dec  6 04:00:44 np0005548918 dracut[1291]: *** Store current command line parameters ***
Dec  6 04:00:44 np0005548918 dracut[1291]: Stored kernel commandline:
Dec  6 04:00:44 np0005548918 dracut[1291]: No dracut internal kernel commandline stored in the initramfs
Dec  6 04:00:44 np0005548918 dracut[1291]: *** Install squash loader ***
Dec  6 04:00:45 np0005548918 dracut[1291]: *** Squashing the files inside the initramfs ***
Dec  6 04:00:46 np0005548918 dracut[1291]: *** Squashing the files inside the initramfs done ***
Dec  6 04:00:46 np0005548918 dracut[1291]: *** Creating image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' ***
Dec  6 04:00:46 np0005548918 dracut[1291]: *** Hardlinking files ***
Dec  6 04:00:46 np0005548918 dracut[1291]: *** Hardlinking files done ***
Dec  6 04:00:47 np0005548918 dracut[1291]: *** Creating initramfs image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' done ***
Dec  6 04:00:48 np0005548918 kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Dec  6 04:00:48 np0005548918 kdumpctl[1018]: kdump: Starting kdump: [OK]
Dec  6 04:00:48 np0005548918 systemd[1]: Finished Crash recovery kernel arming.
Dec  6 04:00:48 np0005548918 systemd[1]: Startup finished in 1.545s (kernel) + 2.410s (initrd) + 18.594s (userspace) = 22.550s.
Dec  6 04:00:54 np0005548918 systemd[1]: Created slice User Slice of UID 1000.
Dec  6 04:00:54 np0005548918 systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec  6 04:00:54 np0005548918 systemd-logind[800]: New session 1 of user zuul.
Dec  6 04:00:54 np0005548918 systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec  6 04:00:54 np0005548918 systemd[1]: Starting User Manager for UID 1000...
Dec  6 04:00:54 np0005548918 systemd[4305]: Queued start job for default target Main User Target.
Dec  6 04:00:54 np0005548918 systemd[4305]: Created slice User Application Slice.
Dec  6 04:00:54 np0005548918 systemd[4305]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  6 04:00:54 np0005548918 systemd[4305]: Started Daily Cleanup of User's Temporary Directories.
Dec  6 04:00:54 np0005548918 systemd[4305]: Reached target Paths.
Dec  6 04:00:54 np0005548918 systemd[4305]: Reached target Timers.
Dec  6 04:00:54 np0005548918 systemd[4305]: Starting D-Bus User Message Bus Socket...
Dec  6 04:00:54 np0005548918 systemd[4305]: Starting Create User's Volatile Files and Directories...
Dec  6 04:00:54 np0005548918 systemd[4305]: Finished Create User's Volatile Files and Directories.
Dec  6 04:00:54 np0005548918 systemd[4305]: Listening on D-Bus User Message Bus Socket.
Dec  6 04:00:54 np0005548918 systemd[4305]: Reached target Sockets.
Dec  6 04:00:54 np0005548918 systemd[4305]: Reached target Basic System.
Dec  6 04:00:54 np0005548918 systemd[4305]: Reached target Main User Target.
Dec  6 04:00:54 np0005548918 systemd[4305]: Startup finished in 110ms.
Dec  6 04:00:54 np0005548918 systemd[1]: Started User Manager for UID 1000.
Dec  6 04:00:54 np0005548918 systemd[1]: Started Session 1 of User zuul.
Dec  6 04:00:55 np0005548918 python3[4387]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:00:59 np0005548918 python3[4415]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:01:02 np0005548918 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  6 04:01:05 np0005548918 python3[4490]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:01:06 np0005548918 python3[4530]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec  6 04:01:08 np0005548918 python3[4556]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDU0JPqo3RlcbkISWeWyZyh8N1DipPCXKbgbj83sLrBXd5pRLoLdbqBjiuLvFfP7lb5gET6+eP3VZiOMI6UHmEm8ynKQRTIQ7lxC6wlJ/5bEkQ7shEony5Dt8S+/YriKnW8SR/bfYJwGVDGiYwX9+YLTEkgtaWYCW5aOhF1JYR2fNVZQyTaBuiZFc/j1+ce31wCfSAIAFETx4TP71KVZET/mDhOPfYQSE6dNJCcZnohKVSa1SHNL0bVxbehOrQrmqmiRc81piGO4LAMvuSM3op7QTjc7lDDNoYX/DWm/O6Yd8IV5PAI5jAYm4zViXyj8K/iPfclSAUCutpd/HwsQjjiI9Ei0ObVrpLhV3PWw6UkMmfRl4sN90Bhg/95I6taoeEDSSNojukndyGr3lxM1SkEHO0ZamuvQmAOsP05x89hsZFP9E+RntviBPqrCNyyiE7JEy2H1WfIK5i0KA/BC8M+osytKOc1zBu/jI4TYPr32yUNd7mIBDzpNaUok32L4Pk= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:09 np0005548918 python3[4580]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:09 np0005548918 python3[4679]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:09 np0005548918 python3[4750]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765011669.4211884-253-37663668821651/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=66d341c321a043af9793d30ca9726f09_id_rsa follow=False checksum=1c48fa8bdbec038bf9f0f4b497dca115d790ad66 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:10 np0005548918 python3[4873]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:10 np0005548918 python3[4944]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765011670.2944748-309-107432664350252/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=66d341c321a043af9793d30ca9726f09_id_rsa.pub follow=False checksum=e7cbe2647d02b25f8aa52dd3d3a0ea1aa1cad833 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:12 np0005548918 python3[4992]: ansible-ping Invoked with data=pong
Dec  6 04:01:13 np0005548918 python3[5016]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:01:15 np0005548918 python3[5074]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec  6 04:01:16 np0005548918 python3[5106]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:16 np0005548918 python3[5130]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:17 np0005548918 python3[5154]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:17 np0005548918 python3[5178]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:17 np0005548918 python3[5202]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:18 np0005548918 python3[5226]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:19 np0005548918 python3[5252]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:20 np0005548918 python3[5330]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:20 np0005548918 python3[5403]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765011680.060516-34-146758522235983/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:21 np0005548918 python3[5451]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:21 np0005548918 python3[5475]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:22 np0005548918 python3[5499]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:22 np0005548918 python3[5523]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:22 np0005548918 python3[5547]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:23 np0005548918 python3[5571]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:23 np0005548918 python3[5595]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:23 np0005548918 python3[5619]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:23 np0005548918 python3[5643]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:24 np0005548918 python3[5667]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:24 np0005548918 python3[5691]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:24 np0005548918 python3[5715]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:24 np0005548918 python3[5739]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:25 np0005548918 python3[5763]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:25 np0005548918 python3[5787]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:25 np0005548918 python3[5811]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:26 np0005548918 python3[5835]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:26 np0005548918 python3[5859]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:26 np0005548918 python3[5883]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:26 np0005548918 python3[5907]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:27 np0005548918 python3[5931]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:27 np0005548918 python3[5955]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:27 np0005548918 python3[5979]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:27 np0005548918 python3[6003]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:28 np0005548918 python3[6027]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:28 np0005548918 python3[6051]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:01:31 np0005548918 python3[6077]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  6 04:01:31 np0005548918 systemd[1]: Starting Time & Date Service...
Dec  6 04:01:31 np0005548918 systemd[1]: Started Time & Date Service.
Dec  6 04:01:31 np0005548918 systemd-timedated[6079]: Changed time zone to 'UTC' (UTC).
Dec  6 04:01:32 np0005548918 python3[6108]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:32 np0005548918 python3[6184]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:33 np0005548918 python3[6255]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1765011692.4291847-254-50304738896998/source _original_basename=tmpwmnnsxkd follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:33 np0005548918 python3[6355]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:33 np0005548918 python3[6426]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765011693.3429573-304-136572348740532/source _original_basename=tmpgsbebvpu follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:34 np0005548918 python3[6528]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:35 np0005548918 python3[6601]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765011694.6501129-384-193609977679218/source _original_basename=tmpjq8x3ahc follow=False checksum=6c462e10cf6b935fb22f4386c31d576dcf4d4133 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:35 np0005548918 python3[6649]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:01:36 np0005548918 python3[6675]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:01:36 np0005548918 python3[6755]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:01:36 np0005548918 python3[6828]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1765011696.3969746-454-143234131838429/source _original_basename=tmp5n_56vtp follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:37 np0005548918 python3[6879]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-c2c1-5ee8-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:01:38 np0005548918 python3[6907]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-c2c1-5ee8-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec  6 04:01:39 np0005548918 python3[6936]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:01:42 np0005548918 irqbalance[796]: Cannot change IRQ 27 affinity: Operation not permitted
Dec  6 04:01:42 np0005548918 irqbalance[796]: IRQ 27 affinity is now unmanaged
Dec  6 04:01:59 np0005548918 python3[6962]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:02:01 np0005548918 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  6 04:02:59 np0005548918 systemd-logind[800]: Session 1 logged out. Waiting for processes to exit.
Dec  6 04:03:28 np0005548918 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  6 04:03:28 np0005548918 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec  6 04:03:28 np0005548918 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec  6 04:03:28 np0005548918 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec  6 04:03:28 np0005548918 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec  6 04:03:28 np0005548918 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec  6 04:03:28 np0005548918 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec  6 04:03:28 np0005548918 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec  6 04:03:28 np0005548918 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec  6 04:03:28 np0005548918 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec  6 04:03:28 np0005548918 NetworkManager[863]: <info>  [1765011808.1787] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  6 04:03:28 np0005548918 systemd-udevd[6966]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:03:28 np0005548918 NetworkManager[863]: <info>  [1765011808.1936] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:03:28 np0005548918 NetworkManager[863]: <info>  [1765011808.1960] settings: (eth1): created default wired connection 'Wired connection 1'
Dec  6 04:03:28 np0005548918 NetworkManager[863]: <info>  [1765011808.1965] device (eth1): carrier: link connected
Dec  6 04:03:28 np0005548918 NetworkManager[863]: <info>  [1765011808.1967] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  6 04:03:28 np0005548918 NetworkManager[863]: <info>  [1765011808.1973] policy: auto-activating connection 'Wired connection 1' (1d8b4314-e1d6-3719-8259-1f6b640b2f44)
Dec  6 04:03:28 np0005548918 NetworkManager[863]: <info>  [1765011808.1977] device (eth1): Activation: starting connection 'Wired connection 1' (1d8b4314-e1d6-3719-8259-1f6b640b2f44)
Dec  6 04:03:28 np0005548918 NetworkManager[863]: <info>  [1765011808.1978] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:03:28 np0005548918 NetworkManager[863]: <info>  [1765011808.1982] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:03:28 np0005548918 NetworkManager[863]: <info>  [1765011808.1986] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:03:28 np0005548918 NetworkManager[863]: <info>  [1765011808.1990] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:03:28 np0005548918 systemd[4305]: Starting Mark boot as successful...
Dec  6 04:03:28 np0005548918 systemd[4305]: Finished Mark boot as successful.
Dec  6 04:03:29 np0005548918 systemd-logind[800]: New session 3 of user zuul.
Dec  6 04:03:29 np0005548918 systemd[1]: Started Session 3 of User zuul.
Dec  6 04:03:29 np0005548918 python3[6997]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-5a9f-9569-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:03:39 np0005548918 python3[7077]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:03:39 np0005548918 python3[7150]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765011819.0271869-206-122552517278128/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=cbfd06489abc1bfebf0cd672db3ddc8d86a3c65d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:03:40 np0005548918 python3[7200]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:03:40 np0005548918 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  6 04:03:40 np0005548918 systemd[1]: Stopped Network Manager Wait Online.
Dec  6 04:03:40 np0005548918 systemd[1]: Stopping Network Manager Wait Online...
Dec  6 04:03:40 np0005548918 systemd[1]: Stopping Network Manager...
Dec  6 04:03:40 np0005548918 NetworkManager[863]: <info>  [1765011820.1210] caught SIGTERM, shutting down normally.
Dec  6 04:03:40 np0005548918 NetworkManager[863]: <info>  [1765011820.1217] dhcp4 (eth0): canceled DHCP transaction
Dec  6 04:03:40 np0005548918 NetworkManager[863]: <info>  [1765011820.1217] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:03:40 np0005548918 NetworkManager[863]: <info>  [1765011820.1217] dhcp4 (eth0): state changed no lease
Dec  6 04:03:40 np0005548918 NetworkManager[863]: <info>  [1765011820.1219] manager: NetworkManager state is now CONNECTING
Dec  6 04:03:40 np0005548918 NetworkManager[863]: <info>  [1765011820.1382] dhcp4 (eth1): canceled DHCP transaction
Dec  6 04:03:40 np0005548918 NetworkManager[863]: <info>  [1765011820.1382] dhcp4 (eth1): state changed no lease
Dec  6 04:03:40 np0005548918 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  6 04:03:40 np0005548918 NetworkManager[863]: <info>  [1765011820.1455] exiting (success)
Dec  6 04:03:40 np0005548918 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  6 04:03:40 np0005548918 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  6 04:03:40 np0005548918 systemd[1]: Stopped Network Manager.
Dec  6 04:03:40 np0005548918 systemd[1]: NetworkManager.service: Consumed 1.085s CPU time, 10.0M memory peak.
Dec  6 04:03:40 np0005548918 systemd[1]: Starting Network Manager...
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2004] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:6ee0f712-c10e-4848-9a0e-9942073e400e)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2006] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2057] manager[0x559ab6548070]: monitoring kernel firmware directory '/lib/firmware'.
Dec  6 04:03:40 np0005548918 systemd[1]: Starting Hostname Service...
Dec  6 04:03:40 np0005548918 systemd[1]: Started Hostname Service.
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2800] hostname: hostname: using hostnamed
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2801] hostname: static hostname changed from (none) to "np0005548918.novalocal"
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2805] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2809] manager[0x559ab6548070]: rfkill: Wi-Fi hardware radio set enabled
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2809] manager[0x559ab6548070]: rfkill: WWAN hardware radio set enabled
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2831] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2831] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2832] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2832] manager: Networking is enabled by state file
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2834] settings: Loaded settings plugin: keyfile (internal)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2837] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2859] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2868] dhcp: init: Using DHCP client 'internal'
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2870] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2876] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2882] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2888] device (lo): Activation: starting connection 'lo' (ff720ffb-c083-491a-b3ff-e737ba278b15)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2894] device (eth0): carrier: link connected
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2897] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2901] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2902] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2907] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2913] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2918] device (eth1): carrier: link connected
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2921] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2926] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (1d8b4314-e1d6-3719-8259-1f6b640b2f44) (indicated)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2926] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2931] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2937] device (eth1): Activation: starting connection 'Wired connection 1' (1d8b4314-e1d6-3719-8259-1f6b640b2f44)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2942] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  6 04:03:40 np0005548918 systemd[1]: Started Network Manager.
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2954] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2957] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2959] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2962] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2964] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2985] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2988] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2991] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.2998] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3002] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3011] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3014] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3030] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3038] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3045] device (lo): Activation: successful, device activated.
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3053] dhcp4 (eth0): state changed new lease, address=38.102.83.74
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3062] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  6 04:03:40 np0005548918 systemd[1]: Starting Network Manager Wait Online...
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3138] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3154] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3155] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3159] manager: NetworkManager state is now CONNECTED_SITE
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3164] device (eth0): Activation: successful, device activated.
Dec  6 04:03:40 np0005548918 NetworkManager[7212]: <info>  [1765011820.3169] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  6 04:03:40 np0005548918 python3[7284]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-5a9f-9569-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:03:50 np0005548918 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  6 04:04:10 np0005548918 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3409] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  6 04:04:25 np0005548918 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  6 04:04:25 np0005548918 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3653] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3656] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3669] device (eth1): Activation: successful, device activated.
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3677] manager: startup complete
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3679] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <warn>  [1765011865.3695] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3702] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec  6 04:04:25 np0005548918 systemd[1]: Finished Network Manager Wait Online.
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3863] dhcp4 (eth1): canceled DHCP transaction
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3864] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3864] dhcp4 (eth1): state changed no lease
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3876] policy: auto-activating connection 'ci-private-network' (6263b1b4-906c-58bf-9646-405c09d409a3)
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3880] device (eth1): Activation: starting connection 'ci-private-network' (6263b1b4-906c-58bf-9646-405c09d409a3)
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3881] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3885] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3890] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3896] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3934] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3935] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:04:25 np0005548918 NetworkManager[7212]: <info>  [1765011865.3940] device (eth1): Activation: successful, device activated.
Dec  6 04:04:35 np0005548918 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  6 04:04:40 np0005548918 systemd-logind[800]: Session 3 logged out. Waiting for processes to exit.
Dec  6 04:04:40 np0005548918 systemd[1]: session-3.scope: Deactivated successfully.
Dec  6 04:04:40 np0005548918 systemd[1]: session-3.scope: Consumed 1.434s CPU time.
Dec  6 04:04:40 np0005548918 systemd-logind[800]: Removed session 3.
Dec  6 04:04:52 np0005548918 systemd-logind[800]: New session 4 of user zuul.
Dec  6 04:04:52 np0005548918 systemd[1]: Started Session 4 of User zuul.
Dec  6 04:04:53 np0005548918 python3[7393]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:04:53 np0005548918 python3[7466]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765011893.0947993-373-122204550877007/source _original_basename=tmpn3o7dte5 follow=False checksum=81d87914000d1f03e4ba3a0a6e4eda468c65f433 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:04:55 np0005548918 systemd[1]: session-4.scope: Deactivated successfully.
Dec  6 04:04:55 np0005548918 systemd-logind[800]: Session 4 logged out. Waiting for processes to exit.
Dec  6 04:04:55 np0005548918 systemd-logind[800]: Removed session 4.
Dec  6 04:06:46 np0005548918 systemd[4305]: Created slice User Background Tasks Slice.
Dec  6 04:06:46 np0005548918 systemd[4305]: Starting Cleanup of User's Temporary Files and Directories...
Dec  6 04:06:46 np0005548918 systemd[4305]: Finished Cleanup of User's Temporary Files and Directories.
Dec  6 04:10:24 np0005548918 systemd-logind[800]: New session 5 of user zuul.
Dec  6 04:10:24 np0005548918 systemd[1]: Started Session 5 of User zuul.
Dec  6 04:10:24 np0005548918 python3[7526]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-6aeb-b52e-000000001cd4-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:10:25 np0005548918 python3[7554]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:10:25 np0005548918 python3[7581]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:10:25 np0005548918 python3[7607]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:10:25 np0005548918 python3[7633]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:10:26 np0005548918 python3[7659]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:10:27 np0005548918 python3[7737]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:10:27 np0005548918 python3[7810]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765012227.178113-519-206429644954335/source _original_basename=tmpms3skqx5 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:10:29 np0005548918 python3[7860]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:10:29 np0005548918 systemd[1]: Reloading.
Dec  6 04:10:29 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:10:30 np0005548918 python3[7917]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec  6 04:10:31 np0005548918 python3[7943]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:10:31 np0005548918 python3[7971]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:10:31 np0005548918 python3[7999]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:10:32 np0005548918 python3[8027]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:10:32 np0005548918 irqbalance[796]: Cannot change IRQ 26 affinity: Operation not permitted
Dec  6 04:10:32 np0005548918 irqbalance[796]: IRQ 26 affinity is now unmanaged
Dec  6 04:10:33 np0005548918 python3[8054]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-6aeb-b52e-000000001cdb-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:10:33 np0005548918 python3[8084]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  6 04:10:36 np0005548918 systemd-logind[800]: Session 5 logged out. Waiting for processes to exit.
Dec  6 04:10:36 np0005548918 systemd[1]: session-5.scope: Deactivated successfully.
Dec  6 04:10:36 np0005548918 systemd[1]: session-5.scope: Consumed 3.833s CPU time.
Dec  6 04:10:36 np0005548918 systemd-logind[800]: Removed session 5.
Dec  6 04:10:38 np0005548918 systemd-logind[800]: New session 6 of user zuul.
Dec  6 04:10:38 np0005548918 systemd[1]: Started Session 6 of User zuul.
Dec  6 04:10:38 np0005548918 python3[8118]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  6 04:10:54 np0005548918 kernel: SELinux:  Converting 384 SID table entries...
Dec  6 04:10:54 np0005548918 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:10:54 np0005548918 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:10:54 np0005548918 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:10:54 np0005548918 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:10:54 np0005548918 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:10:54 np0005548918 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:10:54 np0005548918 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:11:02 np0005548918 kernel: SELinux:  Converting 384 SID table entries...
Dec  6 04:11:02 np0005548918 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:11:02 np0005548918 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:11:02 np0005548918 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:11:02 np0005548918 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:11:02 np0005548918 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:11:02 np0005548918 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:11:02 np0005548918 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:11:13 np0005548918 kernel: SELinux:  Converting 384 SID table entries...
Dec  6 04:11:13 np0005548918 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:11:13 np0005548918 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:11:13 np0005548918 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:11:13 np0005548918 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:11:13 np0005548918 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:11:13 np0005548918 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:11:13 np0005548918 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:11:14 np0005548918 setsebool[8184]: The virt_use_nfs policy boolean was changed to 1 by root
Dec  6 04:11:14 np0005548918 setsebool[8184]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec  6 04:11:25 np0005548918 kernel: SELinux:  Converting 387 SID table entries...
Dec  6 04:11:25 np0005548918 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:11:25 np0005548918 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:11:25 np0005548918 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:11:25 np0005548918 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:11:25 np0005548918 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:11:25 np0005548918 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:11:25 np0005548918 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:11:43 np0005548918 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  6 04:11:43 np0005548918 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:11:43 np0005548918 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:11:43 np0005548918 systemd[1]: Reloading.
Dec  6 04:11:43 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:11:43 np0005548918 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:11:47 np0005548918 python3[13025]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-d561-0a5b-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:11:48 np0005548918 kernel: evm: overlay not supported
Dec  6 04:11:48 np0005548918 systemd[4305]: Starting D-Bus User Message Bus...
Dec  6 04:11:48 np0005548918 dbus-broker-launch[13981]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec  6 04:11:48 np0005548918 dbus-broker-launch[13981]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec  6 04:11:48 np0005548918 systemd[4305]: Started D-Bus User Message Bus.
Dec  6 04:11:48 np0005548918 dbus-broker-lau[13981]: Ready
Dec  6 04:11:48 np0005548918 systemd[4305]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  6 04:11:48 np0005548918 systemd[4305]: Created slice Slice /user.
Dec  6 04:11:48 np0005548918 systemd[4305]: podman-13910.scope: unit configures an IP firewall, but not running as root.
Dec  6 04:11:48 np0005548918 systemd[4305]: (This warning is only shown for the first unit using IP firewalling.)
Dec  6 04:11:48 np0005548918 systemd[4305]: Started podman-13910.scope.
Dec  6 04:11:48 np0005548918 systemd[4305]: Started podman-pause-96b625c5.scope.
Dec  6 04:11:49 np0005548918 systemd[1]: session-6.scope: Deactivated successfully.
Dec  6 04:11:49 np0005548918 systemd[1]: session-6.scope: Consumed 1min 299ms CPU time.
Dec  6 04:11:49 np0005548918 systemd-logind[800]: Session 6 logged out. Waiting for processes to exit.
Dec  6 04:11:49 np0005548918 systemd-logind[800]: Removed session 6.
Dec  6 04:12:18 np0005548918 systemd-logind[800]: New session 7 of user zuul.
Dec  6 04:12:18 np0005548918 systemd[1]: Started Session 7 of User zuul.
Dec  6 04:12:18 np0005548918 python3[27282]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK/b/hDus+zgErbxpiAu4axJ55LMjNixMhoE4DoEU6Wq/xn30MdVWwMPMhgQamY6n3JqihnzwOz1OzKhBTCdzls= zuul@np0005548914.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:12:19 np0005548918 python3[27512]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK/b/hDus+zgErbxpiAu4axJ55LMjNixMhoE4DoEU6Wq/xn30MdVWwMPMhgQamY6n3JqihnzwOz1OzKhBTCdzls= zuul@np0005548914.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:12:20 np0005548918 python3[27975]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548918.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec  6 04:12:20 np0005548918 python3[28252]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK/b/hDus+zgErbxpiAu4axJ55LMjNixMhoE4DoEU6Wq/xn30MdVWwMPMhgQamY6n3JqihnzwOz1OzKhBTCdzls= zuul@np0005548914.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  6 04:12:21 np0005548918 python3[28569]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:12:21 np0005548918 python3[28870]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765012340.8513274-155-124443819453378/source _original_basename=tmppqkc3rnx follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:12:22 np0005548918 python3[29270]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Dec  6 04:12:22 np0005548918 systemd[1]: Starting Hostname Service...
Dec  6 04:12:22 np0005548918 systemd[1]: Started Hostname Service.
Dec  6 04:12:22 np0005548918 systemd-hostnamed[29281]: Changed pretty hostname to 'compute-2'
Dec  6 04:12:22 np0005548918 systemd-hostnamed[29281]: Hostname set to <compute-2> (static)
Dec  6 04:12:22 np0005548918 NetworkManager[7212]: <info>  [1765012342.6588] hostname: static hostname changed from "np0005548918.novalocal" to "compute-2"
Dec  6 04:12:22 np0005548918 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  6 04:12:22 np0005548918 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  6 04:12:23 np0005548918 systemd[1]: session-7.scope: Deactivated successfully.
Dec  6 04:12:23 np0005548918 systemd[1]: session-7.scope: Consumed 2.115s CPU time.
Dec  6 04:12:23 np0005548918 systemd-logind[800]: Session 7 logged out. Waiting for processes to exit.
Dec  6 04:12:23 np0005548918 systemd-logind[800]: Removed session 7.
Dec  6 04:12:24 np0005548918 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:12:24 np0005548918 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:12:24 np0005548918 systemd[1]: man-db-cache-update.service: Consumed 49.740s CPU time.
Dec  6 04:12:24 np0005548918 systemd[1]: run-rbd8bdc6f99704f81b7e23207e046a3c6.service: Deactivated successfully.
Dec  6 04:12:32 np0005548918 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  6 04:12:52 np0005548918 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  6 04:15:46 np0005548918 systemd[1]: Starting Cleanup of Temporary Directories...
Dec  6 04:15:46 np0005548918 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec  6 04:15:46 np0005548918 systemd[1]: Finished Cleanup of Temporary Directories.
Dec  6 04:15:46 np0005548918 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec  6 04:16:07 np0005548918 systemd-logind[800]: New session 8 of user zuul.
Dec  6 04:16:07 np0005548918 systemd[1]: Started Session 8 of User zuul.
Dec  6 04:16:07 np0005548918 python3[29996]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:16:10 np0005548918 python3[30112]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:10 np0005548918 python3[30185]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.1984386-33927-279051227357217/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:16:11 np0005548918 python3[30211]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:11 np0005548918 python3[30284]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.1984386-33927-279051227357217/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:16:11 np0005548918 python3[30310]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:11 np0005548918 python3[30383]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.1984386-33927-279051227357217/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:16:12 np0005548918 python3[30409]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:12 np0005548918 python3[30482]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.1984386-33927-279051227357217/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:16:12 np0005548918 python3[30508]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:13 np0005548918 python3[30581]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.1984386-33927-279051227357217/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:16:13 np0005548918 python3[30607]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:13 np0005548918 python3[30680]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.1984386-33927-279051227357217/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:16:13 np0005548918 python3[30706]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:16:14 np0005548918 python3[30779]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.1984386-33927-279051227357217/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:16:27 np0005548918 python3[30827]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:21:26 np0005548918 systemd-logind[800]: Session 8 logged out. Waiting for processes to exit.
Dec  6 04:21:26 np0005548918 systemd[1]: session-8.scope: Deactivated successfully.
Dec  6 04:21:26 np0005548918 systemd[1]: session-8.scope: Consumed 4.650s CPU time.
Dec  6 04:21:26 np0005548918 systemd-logind[800]: Removed session 8.
Dec  6 04:27:56 np0005548918 systemd-logind[800]: New session 9 of user zuul.
Dec  6 04:27:56 np0005548918 systemd[1]: Started Session 9 of User zuul.
Dec  6 04:27:57 np0005548918 python3.9[30987]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:27:58 np0005548918 python3.9[31168]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:28:06 np0005548918 systemd[1]: session-9.scope: Deactivated successfully.
Dec  6 04:28:06 np0005548918 systemd[1]: session-9.scope: Consumed 7.308s CPU time.
Dec  6 04:28:06 np0005548918 systemd-logind[800]: Session 9 logged out. Waiting for processes to exit.
Dec  6 04:28:06 np0005548918 systemd-logind[800]: Removed session 9.
Dec  6 04:28:22 np0005548918 systemd-logind[800]: New session 10 of user zuul.
Dec  6 04:28:22 np0005548918 systemd[1]: Started Session 10 of User zuul.
Dec  6 04:28:22 np0005548918 python3.9[31378]: ansible-ansible.legacy.ping Invoked with data=pong
Dec  6 04:28:24 np0005548918 python3.9[31552]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:28:25 np0005548918 python3.9[31704]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:28:26 np0005548918 python3.9[31857]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:28:26 np0005548918 python3.9[32009]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:28:27 np0005548918 python3.9[32161]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:28:28 np0005548918 python3.9[32284]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013307.3450994-179-136352138967833/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:28:29 np0005548918 python3.9[32436]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:28:30 np0005548918 python3.9[32592]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:28:31 np0005548918 python3.9[32744]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:28:32 np0005548918 python3.9[32894]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:28:38 np0005548918 python3.9[33147]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:28:40 np0005548918 python3.9[33297]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:28:41 np0005548918 python3.9[33451]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:28:42 np0005548918 python3.9[33609]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:28:43 np0005548918 python3.9[33693]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:29:27 np0005548918 systemd[1]: Reloading.
Dec  6 04:29:27 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:29:27 np0005548918 systemd[1]: Starting dnf makecache...
Dec  6 04:29:27 np0005548918 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec  6 04:29:27 np0005548918 dnf[33900]: Failed determining last makecache time.
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-openstack-barbican-42b4c41831408a8e323 143 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 systemd[1]: Reloading.
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 185 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-openstack-cinder-1c00d6490d88e436f26ef 182 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-python-stevedore-c4acc5639fd2329372142 179 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-python-cloudkitty-tests-tempest-2c80f8 165 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-os-net-config-d0cedbdb788d43e5c7551df5 161 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 186 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-python-designate-tests-tempest-347fdbc 186 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-openstack-glance-1fd12c29b339f30fe823e 192 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 179 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-openstack-manila-3c01b7181572c95dac462 174 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-python-whitebox-neutron-tests-tempest- 168 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-openstack-octavia-ba397f07a7331190208c 177 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-openstack-watcher-c014f81a8647287f6dcc 142 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-ansible-config_template-5ccaa22121a7ff 162 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 162 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 systemd[1]: Reloading.
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-openstack-swift-dc98a8463506ac520c469a 156 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-python-tempestconf-8515371b7cceebd4282 170 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 dnf[33900]: delorean-openstack-heat-ui-013accbfd179753bc3f0 180 kB/s | 3.0 kB     00:00
Dec  6 04:29:27 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:29:27 np0005548918 dnf[33900]: CentOS Stream 9 - BaseOS                         75 kB/s | 7.3 kB     00:00
Dec  6 04:29:27 np0005548918 systemd[1]: Listening on LVM2 poll daemon socket.
Dec  6 04:29:28 np0005548918 dnf[33900]: CentOS Stream 9 - AppStream                      66 kB/s | 7.4 kB     00:00
Dec  6 04:29:28 np0005548918 dbus-broker-launch[747]: Noticed file-system modification, trigger reload.
Dec  6 04:29:28 np0005548918 dbus-broker-launch[747]: Noticed file-system modification, trigger reload.
Dec  6 04:29:28 np0005548918 dbus-broker-launch[747]: Noticed file-system modification, trigger reload.
Dec  6 04:29:28 np0005548918 dnf[33900]: CentOS Stream 9 - CRB                            30 kB/s | 7.2 kB     00:00
Dec  6 04:29:28 np0005548918 dnf[33900]: CentOS Stream 9 - Extras packages                65 kB/s | 8.3 kB     00:00
Dec  6 04:29:28 np0005548918 dnf[33900]: dlrn-antelope-testing                            97 kB/s | 3.0 kB     00:00
Dec  6 04:29:28 np0005548918 dnf[33900]: dlrn-antelope-build-deps                        100 kB/s | 3.0 kB     00:00
Dec  6 04:29:28 np0005548918 dnf[33900]: centos9-rabbitmq                                 82 kB/s | 3.0 kB     00:00
Dec  6 04:29:28 np0005548918 dnf[33900]: centos9-storage                                 125 kB/s | 3.0 kB     00:00
Dec  6 04:29:28 np0005548918 dnf[33900]: centos9-opstools                                122 kB/s | 3.0 kB     00:00
Dec  6 04:29:28 np0005548918 dnf[33900]: NFV SIG OpenvSwitch                             113 kB/s | 3.0 kB     00:00
Dec  6 04:29:28 np0005548918 dnf[33900]: repo-setup-centos-appstream                     196 kB/s | 4.4 kB     00:00
Dec  6 04:29:29 np0005548918 dnf[33900]: repo-setup-centos-baseos                        154 kB/s | 3.9 kB     00:00
Dec  6 04:29:29 np0005548918 dnf[33900]: repo-setup-centos-highavailability              159 kB/s | 3.9 kB     00:00
Dec  6 04:29:29 np0005548918 dnf[33900]: repo-setup-centos-powertools                    190 kB/s | 4.3 kB     00:00
Dec  6 04:29:29 np0005548918 dnf[33900]: Extra Packages for Enterprise Linux 9 - x86_64  232 kB/s |  32 kB     00:00
Dec  6 04:29:29 np0005548918 dnf[33900]: Metadata cache created.
Dec  6 04:29:29 np0005548918 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec  6 04:29:29 np0005548918 systemd[1]: Finished dnf makecache.
Dec  6 04:29:29 np0005548918 systemd[1]: dnf-makecache.service: Consumed 1.930s CPU time.
Dec  6 04:30:31 np0005548918 kernel: SELinux:  Converting 2715 SID table entries...
Dec  6 04:30:31 np0005548918 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:30:31 np0005548918 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:30:31 np0005548918 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:30:31 np0005548918 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:30:31 np0005548918 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:30:31 np0005548918 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:30:31 np0005548918 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:30:31 np0005548918 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec  6 04:30:32 np0005548918 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:30:32 np0005548918 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:30:32 np0005548918 systemd[1]: Reloading.
Dec  6 04:30:32 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:30:32 np0005548918 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:30:33 np0005548918 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:30:33 np0005548918 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:30:33 np0005548918 systemd[1]: man-db-cache-update.service: Consumed 1.456s CPU time.
Dec  6 04:30:33 np0005548918 systemd[1]: run-r2c7963c06d4b4659a3eab6362f3d6c27.service: Deactivated successfully.
Dec  6 04:30:45 np0005548918 python3.9[35261]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:30:48 np0005548918 python3.9[35542]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec  6 04:30:49 np0005548918 python3.9[35694]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec  6 04:30:54 np0005548918 python3.9[35848]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:30:58 np0005548918 python3.9[36000]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec  6 04:31:00 np0005548918 python3.9[36153]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:31:09 np0005548918 python3.9[36305]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:31:09 np0005548918 python3.9[36428]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013464.743877-669-260048599811783/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:31:11 np0005548918 python3.9[36580]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:31:11 np0005548918 python3.9[36732]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:31:12 np0005548918 python3.9[36885]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:31:14 np0005548918 python3.9[37037]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec  6 04:31:15 np0005548918 python3.9[37190]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  6 04:31:15 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:31:15 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:31:16 np0005548918 python3.9[37349]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  6 04:31:17 np0005548918 python3.9[37509]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec  6 04:31:18 np0005548918 python3.9[37662]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  6 04:31:18 np0005548918 python3.9[37820]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec  6 04:31:20 np0005548918 python3.9[37972]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:31:23 np0005548918 python3.9[38125]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:31:24 np0005548918 python3.9[38277]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:31:24 np0005548918 python3.9[38400]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013483.6387963-1026-173405626928138/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:31:25 np0005548918 python3.9[38552]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:31:26 np0005548918 systemd[1]: Starting Load Kernel Modules...
Dec  6 04:31:26 np0005548918 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec  6 04:31:26 np0005548918 kernel: Bridge firewalling registered
Dec  6 04:31:26 np0005548918 systemd-modules-load[38556]: Inserted module 'br_netfilter'
Dec  6 04:31:26 np0005548918 systemd[1]: Finished Load Kernel Modules.
Dec  6 04:31:26 np0005548918 python3.9[38713]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:31:27 np0005548918 python3.9[38836]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013486.4477086-1095-165518832017560/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:31:29 np0005548918 python3.9[38988]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:31:32 np0005548918 dbus-broker-launch[747]: Noticed file-system modification, trigger reload.
Dec  6 04:31:32 np0005548918 dbus-broker-launch[747]: Noticed file-system modification, trigger reload.
Dec  6 04:31:33 np0005548918 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:31:33 np0005548918 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:31:33 np0005548918 systemd[1]: Reloading.
Dec  6 04:31:33 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:31:33 np0005548918 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:31:37 np0005548918 python3.9[42304]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:31:37 np0005548918 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:31:37 np0005548918 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:31:37 np0005548918 systemd[1]: man-db-cache-update.service: Consumed 5.942s CPU time.
Dec  6 04:31:37 np0005548918 systemd[1]: run-r583c0c003fb840e1b31d03f769f8abf6.service: Deactivated successfully.
Dec  6 04:31:38 np0005548918 python3.9[42854]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec  6 04:31:38 np0005548918 python3.9[43004]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:31:40 np0005548918 python3.9[43156]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:31:40 np0005548918 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  6 04:31:40 np0005548918 systemd[1]: Starting Authorization Manager...
Dec  6 04:31:40 np0005548918 polkitd[43373]: Started polkitd version 0.117
Dec  6 04:31:40 np0005548918 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  6 04:31:40 np0005548918 systemd[1]: Started Authorization Manager.
Dec  6 04:31:41 np0005548918 python3.9[43543]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:31:41 np0005548918 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec  6 04:31:42 np0005548918 systemd[1]: tuned.service: Deactivated successfully.
Dec  6 04:31:42 np0005548918 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec  6 04:31:42 np0005548918 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  6 04:31:42 np0005548918 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  6 04:31:43 np0005548918 python3.9[43705]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec  6 04:31:46 np0005548918 python3.9[43857]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:31:46 np0005548918 systemd[1]: Reloading.
Dec  6 04:31:47 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:31:48 np0005548918 python3.9[44048]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:31:48 np0005548918 systemd[1]: Reloading.
Dec  6 04:31:48 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:31:49 np0005548918 python3.9[44237]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:31:50 np0005548918 python3.9[44390]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:31:50 np0005548918 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec  6 04:31:50 np0005548918 python3.9[44543]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:31:53 np0005548918 python3.9[44705]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:31:54 np0005548918 python3.9[44858]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:31:54 np0005548918 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  6 04:31:54 np0005548918 systemd[1]: Stopped Apply Kernel Variables.
Dec  6 04:31:54 np0005548918 systemd[1]: Stopping Apply Kernel Variables...
Dec  6 04:31:54 np0005548918 systemd[1]: Starting Apply Kernel Variables...
Dec  6 04:31:54 np0005548918 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  6 04:31:54 np0005548918 systemd[1]: Finished Apply Kernel Variables.
Dec  6 04:31:54 np0005548918 systemd[1]: session-10.scope: Deactivated successfully.
Dec  6 04:31:54 np0005548918 systemd[1]: session-10.scope: Consumed 2min 19.266s CPU time.
Dec  6 04:31:54 np0005548918 systemd-logind[800]: Session 10 logged out. Waiting for processes to exit.
Dec  6 04:31:54 np0005548918 systemd-logind[800]: Removed session 10.
Dec  6 04:32:00 np0005548918 systemd-logind[800]: New session 11 of user zuul.
Dec  6 04:32:00 np0005548918 systemd[1]: Started Session 11 of User zuul.
Dec  6 04:32:01 np0005548918 python3.9[45041]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:32:02 np0005548918 python3.9[45197]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec  6 04:32:03 np0005548918 python3.9[45350]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  6 04:32:05 np0005548918 python3.9[45508]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  6 04:32:06 np0005548918 python3.9[45668]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:32:07 np0005548918 python3.9[45752]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  6 04:32:14 np0005548918 python3.9[45916]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:32:26 np0005548918 kernel: SELinux:  Converting 2728 SID table entries...
Dec  6 04:32:26 np0005548918 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:32:26 np0005548918 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:32:26 np0005548918 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:32:26 np0005548918 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:32:26 np0005548918 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:32:26 np0005548918 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:32:26 np0005548918 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:32:26 np0005548918 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec  6 04:32:26 np0005548918 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec  6 04:32:28 np0005548918 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:32:28 np0005548918 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:32:28 np0005548918 systemd[1]: Reloading.
Dec  6 04:32:28 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:32:28 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:32:28 np0005548918 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:32:29 np0005548918 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:32:29 np0005548918 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:32:29 np0005548918 systemd[1]: man-db-cache-update.service: Consumed 1.160s CPU time.
Dec  6 04:32:29 np0005548918 systemd[1]: run-r757c97068b3d45528870da2dea76835d.service: Deactivated successfully.
Dec  6 04:32:32 np0005548918 python3.9[47015]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:32:33 np0005548918 systemd[1]: Reloading.
Dec  6 04:32:33 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:32:33 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:32:33 np0005548918 systemd[1]: Starting Open vSwitch Database Unit...
Dec  6 04:32:33 np0005548918 chown[47057]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec  6 04:32:33 np0005548918 ovs-ctl[47062]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec  6 04:32:33 np0005548918 ovs-ctl[47062]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec  6 04:32:33 np0005548918 ovs-ctl[47062]: Starting ovsdb-server [  OK  ]
Dec  6 04:32:33 np0005548918 ovs-vsctl[47111]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec  6 04:32:33 np0005548918 ovs-vsctl[47127]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"1b31b208-e0d4-490d-9f30-552f5575d012\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec  6 04:32:33 np0005548918 ovs-ctl[47062]: Configuring Open vSwitch system IDs [  OK  ]
Dec  6 04:32:33 np0005548918 ovs-ctl[47062]: Enabling remote OVSDB managers [  OK  ]
Dec  6 04:32:33 np0005548918 systemd[1]: Started Open vSwitch Database Unit.
Dec  6 04:32:33 np0005548918 ovs-vsctl[47136]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Dec  6 04:32:33 np0005548918 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec  6 04:32:33 np0005548918 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec  6 04:32:33 np0005548918 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec  6 04:32:34 np0005548918 kernel: openvswitch: Open vSwitch switching datapath
Dec  6 04:32:34 np0005548918 ovs-ctl[47181]: Inserting openvswitch module [  OK  ]
Dec  6 04:32:34 np0005548918 ovs-ctl[47150]: Starting ovs-vswitchd [  OK  ]
Dec  6 04:32:34 np0005548918 ovs-vsctl[47198]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Dec  6 04:32:34 np0005548918 ovs-ctl[47150]: Enabling remote OVSDB managers [  OK  ]
Dec  6 04:32:34 np0005548918 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec  6 04:32:34 np0005548918 systemd[1]: Starting Open vSwitch...
Dec  6 04:32:34 np0005548918 systemd[1]: Finished Open vSwitch.
Dec  6 04:32:36 np0005548918 python3.9[47350]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:32:37 np0005548918 python3.9[47502]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec  6 04:32:38 np0005548918 kernel: SELinux:  Converting 2742 SID table entries...
Dec  6 04:32:38 np0005548918 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:32:38 np0005548918 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:32:38 np0005548918 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:32:38 np0005548918 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:32:38 np0005548918 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:32:38 np0005548918 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:32:38 np0005548918 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:32:39 np0005548918 python3.9[47657]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:32:40 np0005548918 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec  6 04:32:41 np0005548918 python3.9[47815]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:32:43 np0005548918 python3.9[47968]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:32:45 np0005548918 python3.9[48255]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  6 04:32:46 np0005548918 python3.9[48405]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:32:47 np0005548918 python3.9[48559]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:32:49 np0005548918 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:32:49 np0005548918 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:32:49 np0005548918 systemd[1]: Reloading.
Dec  6 04:32:49 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:32:49 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:32:49 np0005548918 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:32:49 np0005548918 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:32:49 np0005548918 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:32:49 np0005548918 systemd[1]: run-r8ae01a893a8c4806869051cffceaa4cc.service: Deactivated successfully.
Dec  6 04:32:51 np0005548918 python3.9[48875]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:32:51 np0005548918 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  6 04:32:51 np0005548918 systemd[1]: Stopped Network Manager Wait Online.
Dec  6 04:32:51 np0005548918 systemd[1]: Stopping Network Manager Wait Online...
Dec  6 04:32:51 np0005548918 systemd[1]: Stopping Network Manager...
Dec  6 04:32:51 np0005548918 NetworkManager[7212]: <info>  [1765013571.6499] caught SIGTERM, shutting down normally.
Dec  6 04:32:51 np0005548918 NetworkManager[7212]: <info>  [1765013571.6518] dhcp4 (eth0): canceled DHCP transaction
Dec  6 04:32:51 np0005548918 NetworkManager[7212]: <info>  [1765013571.6518] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:32:51 np0005548918 NetworkManager[7212]: <info>  [1765013571.6518] dhcp4 (eth0): state changed no lease
Dec  6 04:32:51 np0005548918 NetworkManager[7212]: <info>  [1765013571.6520] manager: NetworkManager state is now CONNECTED_SITE
Dec  6 04:32:51 np0005548918 NetworkManager[7212]: <info>  [1765013571.6583] exiting (success)
Dec  6 04:32:51 np0005548918 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  6 04:32:51 np0005548918 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  6 04:32:51 np0005548918 systemd[1]: Stopped Network Manager.
Dec  6 04:32:51 np0005548918 systemd[1]: NetworkManager.service: Consumed 9.876s CPU time, 4.1M memory peak, read 0B from disk, written 40.5K to disk.
Dec  6 04:32:51 np0005548918 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  6 04:32:51 np0005548918 systemd[1]: Starting Network Manager...
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.7215] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:6ee0f712-c10e-4848-9a0e-9942073e400e)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.7218] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.7280] manager[0x56501f5b1090]: monitoring kernel firmware directory '/lib/firmware'.
Dec  6 04:32:51 np0005548918 systemd[1]: Starting Hostname Service...
Dec  6 04:32:51 np0005548918 systemd[1]: Started Hostname Service.
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8186] hostname: hostname: using hostnamed
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8187] hostname: static hostname changed from (none) to "compute-2"
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8197] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8205] manager[0x56501f5b1090]: rfkill: Wi-Fi hardware radio set enabled
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8206] manager[0x56501f5b1090]: rfkill: WWAN hardware radio set enabled
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8246] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8263] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8264] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8265] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8267] manager: Networking is enabled by state file
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8272] settings: Loaded settings plugin: keyfile (internal)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8278] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8343] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8363] dhcp: init: Using DHCP client 'internal'
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8368] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8381] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8395] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8412] device (lo): Activation: starting connection 'lo' (ff720ffb-c083-491a-b3ff-e737ba278b15)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8426] device (eth0): carrier: link connected
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8434] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8447] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8448] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8466] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8486] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8498] device (eth1): carrier: link connected
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8505] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8517] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (6263b1b4-906c-58bf-9646-405c09d409a3) (indicated)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8518] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8532] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8549] device (eth1): Activation: starting connection 'ci-private-network' (6263b1b4-906c-58bf-9646-405c09d409a3)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8559] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  6 04:32:51 np0005548918 systemd[1]: Started Network Manager.
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8584] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8592] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8596] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8599] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8610] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8616] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8624] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8638] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8653] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8662] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8681] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8699] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8708] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8710] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8716] device (lo): Activation: successful, device activated.
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8723] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8724] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8727] manager: NetworkManager state is now CONNECTED_LOCAL
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8730] device (eth1): Activation: successful, device activated.
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8739] dhcp4 (eth0): state changed new lease, address=38.102.83.74
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8746] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  6 04:32:51 np0005548918 systemd[1]: Starting Network Manager Wait Online...
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8830] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8860] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8862] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8866] manager: NetworkManager state is now CONNECTED_SITE
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8870] device (eth0): Activation: successful, device activated.
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8876] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  6 04:32:51 np0005548918 NetworkManager[48884]: <info>  [1765013571.8880] manager: startup complete
Dec  6 04:32:51 np0005548918 systemd[1]: Finished Network Manager Wait Online.
Dec  6 04:32:52 np0005548918 python3.9[49101]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:32:57 np0005548918 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:32:57 np0005548918 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:32:57 np0005548918 systemd[1]: Reloading.
Dec  6 04:32:57 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:32:57 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:32:57 np0005548918 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:32:58 np0005548918 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:32:58 np0005548918 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:32:58 np0005548918 systemd[1]: run-re2d191eb82114da6819cd59815e5f9fe.service: Deactivated successfully.
Dec  6 04:33:01 np0005548918 python3.9[49561]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:33:02 np0005548918 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  6 04:33:02 np0005548918 python3.9[49713]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:03 np0005548918 python3.9[49867]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:04 np0005548918 python3.9[50019]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:04 np0005548918 python3.9[50171]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:05 np0005548918 python3.9[50323]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:06 np0005548918 python3.9[50475]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:33:07 np0005548918 python3.9[50598]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013585.9347444-649-188771310159306/.source _original_basename=.3dja6yob follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:07 np0005548918 python3.9[50750]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:08 np0005548918 python3.9[50902]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec  6 04:33:09 np0005548918 python3.9[51054]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:12 np0005548918 python3.9[51481]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec  6 04:33:13 np0005548918 ansible-async_wrapper.py[51656]: Invoked with j958529382166 300 /home/zuul/.ansible/tmp/ansible-tmp-1765013592.6522748-847-44152427040607/AnsiballZ_edpm_os_net_config.py _
Dec  6 04:33:13 np0005548918 ansible-async_wrapper.py[51659]: Starting module and watcher
Dec  6 04:33:13 np0005548918 ansible-async_wrapper.py[51659]: Start watching 51660 (300)
Dec  6 04:33:13 np0005548918 ansible-async_wrapper.py[51660]: Start module (51660)
Dec  6 04:33:13 np0005548918 ansible-async_wrapper.py[51656]: Return async_wrapper task started.
Dec  6 04:33:13 np0005548918 python3.9[51661]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec  6 04:33:14 np0005548918 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec  6 04:33:14 np0005548918 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec  6 04:33:14 np0005548918 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec  6 04:33:14 np0005548918 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec  6 04:33:14 np0005548918 kernel: cfg80211: failed to load regulatory.db
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.8986] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9010] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9675] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9677] audit: op="connection-add" uuid="c28eb669-2053-4750-a7db-cd5c6c97f205" name="br-ex-br" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9702] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9703] audit: op="connection-add" uuid="29427ba3-97f4-45e4-a436-20b6b39ce228" name="br-ex-port" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9722] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9723] audit: op="connection-add" uuid="ed67a01e-764a-48c3-a872-53961351f0a8" name="eth1-port" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9739] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9740] audit: op="connection-add" uuid="3fee6815-81fb-46d6-bed9-7adf109bbba1" name="vlan20-port" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9757] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9758] audit: op="connection-add" uuid="e36b3dac-74d1-4dbb-bd9b-66e85d79540b" name="vlan21-port" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9775] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9776] audit: op="connection-add" uuid="9bc154b8-10eb-42ae-aaf2-71cd0eef4bb2" name="vlan22-port" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9791] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9792] audit: op="connection-add" uuid="4fe3f07c-d110-4aba-834a-3d94ab6e3425" name="vlan23-port" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9818] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp,ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9837] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9838] audit: op="connection-add" uuid="266ba40a-a51c-4572-ad4b-15f7497b7740" name="br-ex-if" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9872] audit: op="connection-update" uuid="6263b1b4-906c-58bf-9646-405c09d409a3" name="ci-private-network" args="ipv4.dns,ipv4.method,ipv4.never-default,ipv4.routing-rules,ipv4.addresses,ipv4.routes,ovs-interface.type,ipv6.dns,ipv6.method,ipv6.addr-gen-mode,ipv6.routing-rules,ipv6.addresses,ipv6.routes,connection.controller,connection.slave-type,connection.timestamp,connection.master,connection.port-type,ovs-external-ids.data" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9892] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9893] audit: op="connection-add" uuid="4f7f4997-43b3-49cd-865f-695a0816cb25" name="vlan20-if" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9913] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9914] audit: op="connection-add" uuid="9a219ba7-9dbb-4714-9928-36b15f9f0a0c" name="vlan21-if" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9934] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9936] audit: op="connection-add" uuid="1a73fefb-48c5-4a12-bb04-b86cfbd7ddf0" name="vlan22-if" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9956] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9958] audit: op="connection-add" uuid="be28e957-4aff-435a-b6ab-2b650e06fd00" name="vlan23-if" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9975] audit: op="connection-delete" uuid="1d8b4314-e1d6-3719-8259-1f6b640b2f44" name="Wired connection 1" pid=51662 uid=0 result="success"
Dec  6 04:33:15 np0005548918 NetworkManager[48884]: <info>  [1765013595.9993] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0003] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0007] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (c28eb669-2053-4750-a7db-cd5c6c97f205)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0007] audit: op="connection-activate" uuid="c28eb669-2053-4750-a7db-cd5c6c97f205" name="br-ex-br" pid=51662 uid=0 result="success"
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0008] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0015] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0018] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (29427ba3-97f4-45e4-a436-20b6b39ce228)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0020] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0025] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0028] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (ed67a01e-764a-48c3-a872-53961351f0a8)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0030] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0036] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0039] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (3fee6815-81fb-46d6-bed9-7adf109bbba1)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0041] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0047] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0050] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (e36b3dac-74d1-4dbb-bd9b-66e85d79540b)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0052] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0058] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0062] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (9bc154b8-10eb-42ae-aaf2-71cd0eef4bb2)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0063] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0069] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0073] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (4fe3f07c-d110-4aba-834a-3d94ab6e3425)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0073] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0075] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0077] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0083] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0087] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0090] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (266ba40a-a51c-4572-ad4b-15f7497b7740)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0091] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0094] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0096] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0097] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0098] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0108] device (eth1): disconnecting for new activation request.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0109] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0112] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0113] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0114] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0116] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0121] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0124] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (4f7f4997-43b3-49cd-865f-695a0816cb25)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0125] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0127] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0129] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0130] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0132] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0136] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0139] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (9a219ba7-9dbb-4714-9928-36b15f9f0a0c)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0140] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0142] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0144] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0145] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0148] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0152] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0156] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (1a73fefb-48c5-4a12-bb04-b86cfbd7ddf0)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0156] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0159] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0160] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0162] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0165] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0169] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0174] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (be28e957-4aff-435a-b6ab-2b650e06fd00)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0174] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0177] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0179] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0180] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0181] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0197] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode" pid=51662 uid=0 result="success"
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0199] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0202] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0204] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0210] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0214] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0218] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0221] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0222] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0227] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0231] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0234] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0236] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 kernel: ovs-system: entered promiscuous mode
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0240] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0244] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0247] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0248] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0254] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0257] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0259] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0261] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0265] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0269] dhcp4 (eth0): canceled DHCP transaction
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0269] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0269] dhcp4 (eth0): state changed no lease
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0271] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec  6 04:33:16 np0005548918 kernel: Timeout policy base is empty
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0284] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0287] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51662 uid=0 result="fail" reason="Device is not activated"
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0293] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec  6 04:33:16 np0005548918 systemd-udevd[51668]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:33:16 np0005548918 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0325] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0341] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0347] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0357] device (eth1): disconnecting for new activation request.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0358] audit: op="connection-activate" uuid="6263b1b4-906c-58bf-9646-405c09d409a3" name="ci-private-network" pid=51662 uid=0 result="success"
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0410] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51662 uid=0 result="success"
Dec  6 04:33:16 np0005548918 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0565] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0743] device (eth1): Activation: starting connection 'ci-private-network' (6263b1b4-906c-58bf-9646-405c09d409a3)
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0763] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0766] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 kernel: br-ex: entered promiscuous mode
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0774] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0775] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0776] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0777] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0778] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0779] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0780] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0787] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0793] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0797] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0803] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0806] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0809] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0812] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0816] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0819] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0823] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0826] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0832] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0835] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0839] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0843] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0849] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0854] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0927] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0931] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 kernel: vlan22: entered promiscuous mode
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0940] device (eth1): Activation: successful, device activated.
Dec  6 04:33:16 np0005548918 systemd-udevd[51666]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0964] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.0977] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 kernel: vlan23: entered promiscuous mode
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1035] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1038] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1045] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548918 systemd-udevd[51667]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:33:16 np0005548918 kernel: vlan20: entered promiscuous mode
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1159] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1176] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 kernel: vlan21: entered promiscuous mode
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1220] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1224] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1231] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1242] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1253] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1291] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1298] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1304] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1311] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1322] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1364] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1369] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1370] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1377] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1398] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1436] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1441] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  6 04:33:16 np0005548918 NetworkManager[48884]: <info>  [1765013596.1447] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  6 04:33:17 np0005548918 NetworkManager[48884]: <info>  [1765013597.0534] dhcp4 (eth0): state changed new lease, address=38.102.83.74
Dec  6 04:33:17 np0005548918 NetworkManager[48884]: <info>  [1765013597.2934] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51662 uid=0 result="success"
Dec  6 04:33:17 np0005548918 python3.9[52020]: ansible-ansible.legacy.async_status Invoked with jid=j958529382166.51656 mode=status _async_dir=/root/.ansible_async
Dec  6 04:33:17 np0005548918 NetworkManager[48884]: <info>  [1765013597.6099] checkpoint[0x56501f587950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec  6 04:33:17 np0005548918 NetworkManager[48884]: <info>  [1765013597.6104] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51662 uid=0 result="success"
Dec  6 04:33:18 np0005548918 NetworkManager[48884]: <info>  [1765013598.0644] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51662 uid=0 result="success"
Dec  6 04:33:18 np0005548918 NetworkManager[48884]: <info>  [1765013598.0658] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51662 uid=0 result="success"
Dec  6 04:33:18 np0005548918 NetworkManager[48884]: <info>  [1765013598.3363] audit: op="networking-control" arg="global-dns-configuration" pid=51662 uid=0 result="success"
Dec  6 04:33:18 np0005548918 NetworkManager[48884]: <info>  [1765013598.3397] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec  6 04:33:18 np0005548918 NetworkManager[48884]: <info>  [1765013598.3434] audit: op="networking-control" arg="global-dns-configuration" pid=51662 uid=0 result="success"
Dec  6 04:33:18 np0005548918 NetworkManager[48884]: <info>  [1765013598.3457] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51662 uid=0 result="success"
Dec  6 04:33:18 np0005548918 NetworkManager[48884]: <info>  [1765013598.5008] checkpoint[0x56501f587a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec  6 04:33:18 np0005548918 NetworkManager[48884]: <info>  [1765013598.5018] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51662 uid=0 result="success"
Dec  6 04:33:18 np0005548918 ansible-async_wrapper.py[51660]: Module complete (51660)
Dec  6 04:33:18 np0005548918 ansible-async_wrapper.py[51659]: Done in kid B.
Dec  6 04:33:21 np0005548918 python3.9[52126]: ansible-ansible.legacy.async_status Invoked with jid=j958529382166.51656 mode=status _async_dir=/root/.ansible_async
Dec  6 04:33:21 np0005548918 python3.9[52226]: ansible-ansible.legacy.async_status Invoked with jid=j958529382166.51656 mode=cleanup _async_dir=/root/.ansible_async
Dec  6 04:33:21 np0005548918 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  6 04:33:22 np0005548918 python3.9[52380]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:33:23 np0005548918 python3.9[52503]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013602.0657032-928-84432628924763/.source.returncode _original_basename=.r8z9ggh0 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:24 np0005548918 python3.9[52655]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:33:24 np0005548918 python3.9[52779]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013603.6938243-976-178618838969813/.source.cfg _original_basename=.16ifsttr follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:25 np0005548918 python3.9[52931]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:33:26 np0005548918 systemd[1]: Reloading Network Manager...
Dec  6 04:33:26 np0005548918 NetworkManager[48884]: <info>  [1765013606.0240] audit: op="reload" arg="0" pid=52935 uid=0 result="success"
Dec  6 04:33:26 np0005548918 NetworkManager[48884]: <info>  [1765013606.0252] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec  6 04:33:26 np0005548918 systemd[1]: Reloaded Network Manager.
Dec  6 04:33:26 np0005548918 systemd-logind[800]: Session 11 logged out. Waiting for processes to exit.
Dec  6 04:33:26 np0005548918 systemd[1]: session-11.scope: Deactivated successfully.
Dec  6 04:33:26 np0005548918 systemd[1]: session-11.scope: Consumed 56.745s CPU time.
Dec  6 04:33:26 np0005548918 systemd-logind[800]: Removed session 11.
Dec  6 04:33:32 np0005548918 systemd-logind[800]: New session 12 of user zuul.
Dec  6 04:33:32 np0005548918 systemd[1]: Started Session 12 of User zuul.
Dec  6 04:33:33 np0005548918 python3.9[53119]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:33:34 np0005548918 python3.9[53274]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:33:36 np0005548918 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  6 04:33:37 np0005548918 python3.9[53469]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:33:37 np0005548918 systemd[1]: session-12.scope: Deactivated successfully.
Dec  6 04:33:37 np0005548918 systemd[1]: session-12.scope: Consumed 2.780s CPU time.
Dec  6 04:33:37 np0005548918 systemd-logind[800]: Session 12 logged out. Waiting for processes to exit.
Dec  6 04:33:37 np0005548918 systemd-logind[800]: Removed session 12.
Dec  6 04:33:43 np0005548918 systemd-logind[800]: New session 13 of user zuul.
Dec  6 04:33:43 np0005548918 systemd[1]: Started Session 13 of User zuul.
Dec  6 04:33:44 np0005548918 python3.9[53650]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:33:46 np0005548918 python3.9[53804]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:33:47 np0005548918 python3.9[53961]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:33:48 np0005548918 python3.9[54045]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:33:50 np0005548918 python3.9[54198]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:33:52 np0005548918 python3.9[54393]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:53 np0005548918 python3.9[54545]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:33:53 np0005548918 systemd[1]: var-lib-containers-storage-overlay-compat152436802-merged.mount: Deactivated successfully.
Dec  6 04:33:53 np0005548918 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck3185175904-merged.mount: Deactivated successfully.
Dec  6 04:33:53 np0005548918 podman[54546]: 2025-12-06 09:33:53.211848257 +0000 UTC m=+0.069668253 system refresh
Dec  6 04:33:54 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:33:54 np0005548918 python3.9[54709]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:33:55 np0005548918 python3.9[54832]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013633.5409899-199-187082232260650/.source.json follow=False _original_basename=podman_network_config.j2 checksum=a27882b0da67cef135d266724f14f72a5d90ab71 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:33:55 np0005548918 python3.9[54984]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:33:56 np0005548918 python3.9[55107]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013635.2735932-244-31024757895701/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:33:57 np0005548918 python3.9[55259]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:33:58 np0005548918 python3.9[55411]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:33:58 np0005548918 python3.9[55563]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:33:59 np0005548918 python3.9[55715]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:34:00 np0005548918 python3.9[55867]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:34:03 np0005548918 python3.9[56020]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:34:04 np0005548918 python3.9[56174]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:34:05 np0005548918 python3.9[56326]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:34:06 np0005548918 python3.9[56478]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:34:07 np0005548918 python3.9[56631]: ansible-service_facts Invoked
Dec  6 04:34:07 np0005548918 network[56648]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:34:07 np0005548918 network[56649]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:34:07 np0005548918 network[56650]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:34:14 np0005548918 python3.9[57102]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:34:18 np0005548918 python3.9[57255]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec  6 04:34:19 np0005548918 python3.9[57407]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:20 np0005548918 python3.9[57532]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013659.1441627-677-267862031649900/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:21 np0005548918 python3.9[57686]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:21 np0005548918 python3.9[57811]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013660.798782-724-3306756101178/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:23 np0005548918 python3.9[57965]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:25 np0005548918 python3.9[58119]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:34:26 np0005548918 python3.9[58203]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:34:28 np0005548918 python3.9[58357]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:34:29 np0005548918 python3.9[58441]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:34:30 np0005548918 chronyd[789]: chronyd exiting
Dec  6 04:34:30 np0005548918 systemd[1]: Stopping NTP client/server...
Dec  6 04:34:30 np0005548918 systemd[1]: chronyd.service: Deactivated successfully.
Dec  6 04:34:30 np0005548918 systemd[1]: Stopped NTP client/server.
Dec  6 04:34:30 np0005548918 systemd[1]: Starting NTP client/server...
Dec  6 04:34:30 np0005548918 chronyd[58449]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  6 04:34:30 np0005548918 chronyd[58449]: Frequency -26.574 +/- 0.958 ppm read from /var/lib/chrony/drift
Dec  6 04:34:30 np0005548918 chronyd[58449]: Loaded seccomp filter (level 2)
Dec  6 04:34:30 np0005548918 systemd[1]: Started NTP client/server.
Dec  6 04:34:30 np0005548918 systemd[1]: session-13.scope: Deactivated successfully.
Dec  6 04:34:30 np0005548918 systemd[1]: session-13.scope: Consumed 29.595s CPU time.
Dec  6 04:34:30 np0005548918 systemd-logind[800]: Session 13 logged out. Waiting for processes to exit.
Dec  6 04:34:30 np0005548918 systemd-logind[800]: Removed session 13.
Dec  6 04:34:36 np0005548918 systemd-logind[800]: New session 14 of user zuul.
Dec  6 04:34:36 np0005548918 systemd[1]: Started Session 14 of User zuul.
Dec  6 04:34:36 np0005548918 python3.9[58630]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:38 np0005548918 python3.9[58782]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:38 np0005548918 python3.9[58905]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013677.2831576-64-70792037091592/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:39 np0005548918 systemd[1]: session-14.scope: Deactivated successfully.
Dec  6 04:34:39 np0005548918 systemd[1]: session-14.scope: Consumed 1.949s CPU time.
Dec  6 04:34:39 np0005548918 systemd-logind[800]: Session 14 logged out. Waiting for processes to exit.
Dec  6 04:34:39 np0005548918 systemd-logind[800]: Removed session 14.
Dec  6 04:34:45 np0005548918 systemd-logind[800]: New session 15 of user zuul.
Dec  6 04:34:45 np0005548918 systemd[1]: Started Session 15 of User zuul.
Dec  6 04:34:46 np0005548918 python3.9[59083]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:34:48 np0005548918 python3.9[59239]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:49 np0005548918 python3.9[59414]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:49 np0005548918 python3.9[59537]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1765013688.5153952-85-235120883146101/.source.json _original_basename=.489dejfh follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:51 np0005548918 python3.9[59689]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:51 np0005548918 python3.9[59812]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013690.681579-156-269048156829957/.source _original_basename=.8b4vrgik follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:52 np0005548918 python3.9[59964]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:34:53 np0005548918 python3.9[60117]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:54 np0005548918 python3.9[60240]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013693.0672407-229-105689121567817/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:34:55 np0005548918 python3.9[60392]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:55 np0005548918 python3.9[60515]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013694.5412357-229-206666800416682/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:34:56 np0005548918 python3.9[60667]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:57 np0005548918 python3.9[60819]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:58 np0005548918 python3.9[60942]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013696.9270418-337-64794679580016/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:34:59 np0005548918 python3.9[61094]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:34:59 np0005548918 python3.9[61217]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013698.643265-383-13676712565204/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:01 np0005548918 python3.9[61369]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:35:01 np0005548918 systemd[1]: Reloading.
Dec  6 04:35:01 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:35:01 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:35:01 np0005548918 systemd[1]: Reloading.
Dec  6 04:35:01 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:35:01 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:35:01 np0005548918 systemd[1]: Starting EDPM Container Shutdown...
Dec  6 04:35:01 np0005548918 systemd[1]: Finished EDPM Container Shutdown.
Dec  6 04:35:02 np0005548918 python3.9[61595]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:03 np0005548918 python3.9[61718]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013702.049745-452-50829216656297/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:04 np0005548918 python3.9[61870]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:04 np0005548918 python3.9[61993]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013703.5040672-497-225089803387200/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:05 np0005548918 python3.9[62145]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:35:05 np0005548918 systemd[1]: Reloading.
Dec  6 04:35:05 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:35:05 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:35:06 np0005548918 systemd[1]: Reloading.
Dec  6 04:35:06 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:35:06 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:35:06 np0005548918 systemd[1]: Starting Create netns directory...
Dec  6 04:35:06 np0005548918 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  6 04:35:06 np0005548918 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  6 04:35:06 np0005548918 systemd[1]: Finished Create netns directory.
Dec  6 04:35:07 np0005548918 python3.9[62371]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:35:07 np0005548918 network[62388]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:35:07 np0005548918 network[62389]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:35:07 np0005548918 network[62390]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:35:13 np0005548918 python3.9[62652]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:35:13 np0005548918 systemd[1]: Reloading.
Dec  6 04:35:13 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:35:13 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:35:13 np0005548918 systemd[1]: Stopping IPv4 firewall with iptables...
Dec  6 04:35:14 np0005548918 iptables.init[62692]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec  6 04:35:14 np0005548918 iptables.init[62692]: iptables: Flushing firewall rules: [  OK  ]
Dec  6 04:35:14 np0005548918 systemd[1]: iptables.service: Deactivated successfully.
Dec  6 04:35:14 np0005548918 systemd[1]: Stopped IPv4 firewall with iptables.
Dec  6 04:35:15 np0005548918 python3.9[62890]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:35:17 np0005548918 python3.9[63044]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:35:17 np0005548918 systemd[1]: Reloading.
Dec  6 04:35:17 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:35:17 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:35:17 np0005548918 systemd[1]: Starting Netfilter Tables...
Dec  6 04:35:17 np0005548918 systemd[1]: Finished Netfilter Tables.
Dec  6 04:35:18 np0005548918 python3.9[63237]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:35:19 np0005548918 python3.9[63390]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:20 np0005548918 python3.9[63515]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013719.492327-703-239641196084004/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:21 np0005548918 python3.9[63668]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:35:21 np0005548918 systemd[1]: Reloading OpenSSH server daemon...
Dec  6 04:35:21 np0005548918 systemd[1]: Reloaded OpenSSH server daemon.
Dec  6 04:35:23 np0005548918 python3.9[63824]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:24 np0005548918 python3.9[63976]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:25 np0005548918 python3.9[64099]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013723.9000623-797-58811978776229/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:26 np0005548918 python3.9[64251]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  6 04:35:26 np0005548918 systemd[1]: Starting Time & Date Service...
Dec  6 04:35:26 np0005548918 systemd[1]: Started Time & Date Service.
Dec  6 04:35:28 np0005548918 python3.9[64407]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:29 np0005548918 python3.9[64559]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:29 np0005548918 python3.9[64682]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013728.6548758-902-209399453670170/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:30 np0005548918 python3.9[64834]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:31 np0005548918 python3.9[64957]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013730.2050064-947-216944906649848/.source.yaml _original_basename=.77j5icm1 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:32 np0005548918 python3.9[65109]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:32 np0005548918 python3.9[65232]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013731.7681367-992-122112861777743/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:33 np0005548918 python3.9[65384]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:35:34 np0005548918 python3.9[65537]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:35:35 np0005548918 python3[65690]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  6 04:35:36 np0005548918 python3.9[65842]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:36 np0005548918 python3.9[65965]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013735.770513-1108-216537258955572/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:37 np0005548918 python3.9[66117]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:38 np0005548918 python3.9[66240]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013737.328515-1154-24157296092382/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:39 np0005548918 python3.9[66392]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:40 np0005548918 python3.9[66515]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013738.9211771-1198-40304925994933/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:40 np0005548918 python3.9[66667]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:41 np0005548918 python3.9[66790]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013740.4767325-1244-128806667332099/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:42 np0005548918 python3.9[66942]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:35:43 np0005548918 python3.9[67065]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013741.997183-1289-40308137916401/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:44 np0005548918 python3.9[67217]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:44 np0005548918 python3.9[67369]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:35:46 np0005548918 python3.9[67528]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:47 np0005548918 python3.9[67681]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:48 np0005548918 python3.9[67833]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:35:49 np0005548918 python3.9[67985]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  6 04:35:50 np0005548918 python3.9[68138]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  6 04:35:51 np0005548918 systemd[1]: session-15.scope: Deactivated successfully.
Dec  6 04:35:51 np0005548918 systemd[1]: session-15.scope: Consumed 41.352s CPU time.
Dec  6 04:35:51 np0005548918 systemd-logind[800]: Session 15 logged out. Waiting for processes to exit.
Dec  6 04:35:51 np0005548918 systemd-logind[800]: Removed session 15.
Dec  6 04:35:56 np0005548918 systemd-logind[800]: New session 16 of user zuul.
Dec  6 04:35:56 np0005548918 systemd[1]: Started Session 16 of User zuul.
Dec  6 04:35:56 np0005548918 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  6 04:35:57 np0005548918 python3.9[68321]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec  6 04:35:58 np0005548918 python3.9[68473]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:35:59 np0005548918 python3.9[68625]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:36:00 np0005548918 python3.9[68777]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCtvqYC0W0zPSX/plyJvm0q1VGDScYTNlcCdllukOe81JRfU3GhVusPZOX0xRSaLP/lmXtfqWcbBRCkLsmFrAo2EHn1CMqMr5WkhY4+rgApF+MGLDOUo57tlKZLPIwdL0SSY/Qv8lBfrqr7LUDZ7fTTTbqTzim/bncxg/u0KxSWBdvjfmYi13SwO65wDkFqSVYa3h8DNij6cRRjQ0fJuJ9Da860hmMnqo9GJMU6dq3zMXXn3YfuF4E4M0UQdlWmVW4EwBTzsfA1XYbSpW7VdRJw6esB4vZ9/Succj+XZiANoDqL9gXSEjNXVVWVbL/7aGJJF9LLQ3VVxmHdbYs1NcTI6Yy9d61zDJHnK/nlYHMhmAHxiDsZEpv0xF72LLzaI86xxvnbx4eUpnyW6LnKiUCYUAUrWIMpLiIbWUxeIoYmj9rqLhwlo5kCy7WdCYYEMTtGI53oIyU0EbXf/r4WAuzmqpVRPyc2Sd5tYD4aXh1JZLUcZy+NLR0Y4SA8RflKFcs=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFDJYF6pUvFgGUbY2QEOHAq7ZEhRQJUqPTVPOuTyb476#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPJ19afQPeSMtr3O9L1fe5+bNzTAsOOCA5fLihUdryDYc29KKD+0XABHKIvqeefcCsIBjZRA//9OzCUftfvXK9A=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAiB67qk/R3IfGpcAH1Ojopc8KX94De+Kxs31cKQLD04X+4QRXPRdMxU85LOhN58eKoHaBi8cgqk7+dvRypGD5vbtbRN9r0VN7tGwiSQTlVFbEuhn0AEbnRwNAMWEEMHO9kEjufP4N2zEEhtQBXy9oO2tMX3+BX4Z3YZZMQyZUgohdBHp2VCul9VdRuo0oHSr8HHm0nN61dMjalnThmgkGAu5hG8qhkWT4i9hroSKBsR5kVBUFTqdXekYkVy4YIYfM2lBXiMOFHtvr1a+KOyIfgWMb7GBPW7oKqtzCfVgSbGaUhSvGzs1OWt3U/PjjapIlmDnwD5ukzVxWV5ldh0vA48tXh5R1wqAoN5/Y/RiAKaY2kd/fvtkhvVDGZluXOz5jJ02IFHm+v4dP3Ig8YOuS5BEkWFuJHkblW0t/+4siTHWwmGEuvUI6y8Gb2pGcBKsWCJtLePYzT09IAmrjwO0jAgbWy0nvCZ+SKlbBBrXP6OgNgMkA+GH9iGOl6FOuRok=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGYNj3LmNvR0emoQHuuy9NKXPivs/dznunVy8GExnJl8#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJhKmGSvg8FMw16qKPzk6Pyj+OHkN3bmk20mts1PdCRcNRnn9sT1DgI6U8Aze1tjGPujT4eDL+Y9r/hsrfM4qDc=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDneZurSARwLaZA1xEymzXlvVAPvP8u0PCrqXuMYD5ewImDDChRITnk4XHKT/DUfrSJf9/7oJsddEbLRjhCtedqrMZsCkWz1BxtCmPBuvz2LfFhEn27TjqYLctOVGigQGsj6ILvPOzzLiapd93yApWDmH6P0un/ltmdM0iZLygNpzG3HLF8STBXzlo/8slci69Em7XppcrOpl1TS7DaVlpNcRQvo9pFuIrbMD9g0DOdMwk5YCH6g7OzGWqq0gt0YUOztmsqxWHKav3E0SXAD/vkgRc/1ZCNGFNSvf0dIgimCF3xlNWrppnvNgQ1BRqiQ7RArlOp1bVg0Ugdce6f4TIrq36Ois2U5+/myF5WQ7l9hRMRvoP64hSSsRAIDobTI/zMStUP3iZPFngxDxwQtpydHfFGywBL9811c42U7JsGxE8890uOIDk/oOkyhSH6KHQCPFjmKBJ98nT01lgnXyFSNOqds6QOYBasUWNFWd2wS7YpTheGlVVM8bk/gB4K2L0=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOMkn8zp09tRuEaH/bUoP0rYj+dziM1KcqMKxOgM9K1U#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCrMdvJJYP0cflC7RDFsxwr66nSp9R7QU726CAfJcKLw6vHh8Z9Lw5wLH0kiaSpsb6SAPffloplHEDiwTOkghOc=#012 create=True mode=0644 path=/tmp/ansible.ihcvzdao state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:36:01 np0005548918 python3.9[68929]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.ihcvzdao' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:36:02 np0005548918 python3.9[69083]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.ihcvzdao state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:36:02 np0005548918 systemd[1]: session-16.scope: Deactivated successfully.
Dec  6 04:36:02 np0005548918 systemd[1]: session-16.scope: Consumed 4.274s CPU time.
Dec  6 04:36:02 np0005548918 systemd-logind[800]: Session 16 logged out. Waiting for processes to exit.
Dec  6 04:36:02 np0005548918 systemd-logind[800]: Removed session 16.
Dec  6 04:36:08 np0005548918 systemd-logind[800]: New session 17 of user zuul.
Dec  6 04:36:08 np0005548918 systemd[1]: Started Session 17 of User zuul.
Dec  6 04:36:09 np0005548918 python3.9[69261]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:36:10 np0005548918 python3.9[69417]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  6 04:36:11 np0005548918 python3.9[69571]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:36:12 np0005548918 python3.9[69724]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:36:13 np0005548918 python3.9[69877]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:36:14 np0005548918 python3.9[70031]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:36:15 np0005548918 python3.9[70186]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:36:16 np0005548918 systemd[1]: session-17.scope: Deactivated successfully.
Dec  6 04:36:16 np0005548918 systemd[1]: session-17.scope: Consumed 5.301s CPU time.
Dec  6 04:36:16 np0005548918 systemd-logind[800]: Session 17 logged out. Waiting for processes to exit.
Dec  6 04:36:16 np0005548918 systemd-logind[800]: Removed session 17.
Dec  6 04:36:21 np0005548918 systemd-logind[800]: New session 18 of user zuul.
Dec  6 04:36:21 np0005548918 systemd[1]: Started Session 18 of User zuul.
Dec  6 04:36:22 np0005548918 python3.9[70365]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:36:23 np0005548918 python3.9[70521]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:36:24 np0005548918 python3.9[70605]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  6 04:36:27 np0005548918 python3.9[70756]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:36:28 np0005548918 python3.9[70907]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  6 04:36:29 np0005548918 python3.9[71057]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:36:29 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:36:30 np0005548918 python3.9[71208]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:36:30 np0005548918 systemd[1]: session-18.scope: Deactivated successfully.
Dec  6 04:36:30 np0005548918 systemd[1]: session-18.scope: Consumed 6.246s CPU time.
Dec  6 04:36:30 np0005548918 systemd-logind[800]: Session 18 logged out. Waiting for processes to exit.
Dec  6 04:36:30 np0005548918 systemd-logind[800]: Removed session 18.
Dec  6 04:36:40 np0005548918 chronyd[58449]: Selected source 208.81.1.244 (pool.ntp.org)
Dec  6 04:36:41 np0005548918 systemd-logind[800]: New session 19 of user zuul.
Dec  6 04:36:41 np0005548918 systemd[1]: Started Session 19 of User zuul.
Dec  6 04:36:49 np0005548918 python3[71975]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:36:51 np0005548918 python3[72070]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  6 04:36:53 np0005548918 python3[72097]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  6 04:36:53 np0005548918 python3[72123]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:36:53 np0005548918 kernel: loop: module loaded
Dec  6 04:36:53 np0005548918 kernel: loop3: detected capacity change from 0 to 41943040
Dec  6 04:36:53 np0005548918 python3[72159]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:36:53 np0005548918 lvm[72162]: PV /dev/loop3 not used.
Dec  6 04:36:54 np0005548918 lvm[72171]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:36:54 np0005548918 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Dec  6 04:36:54 np0005548918 lvm[72173]:  1 logical volume(s) in volume group "ceph_vg0" now active
Dec  6 04:36:54 np0005548918 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Dec  6 04:36:54 np0005548918 python3[72252]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  6 04:36:54 np0005548918 python3[72325]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765013814.34596-36829-84778264703191/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:36:55 np0005548918 python3[72375]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:36:55 np0005548918 systemd[1]: Reloading.
Dec  6 04:36:55 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:36:55 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:36:56 np0005548918 systemd[1]: Starting Ceph OSD losetup...
Dec  6 04:36:56 np0005548918 bash[72415]: /dev/loop3: [64513]:4327938 (/var/lib/ceph-osd-0.img)
Dec  6 04:36:56 np0005548918 systemd[1]: Finished Ceph OSD losetup.
Dec  6 04:36:56 np0005548918 lvm[72417]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:36:56 np0005548918 lvm[72417]: VG ceph_vg0 finished
Dec  6 04:36:58 np0005548918 python3[72441]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:38:38 np0005548918 systemd-logind[800]: New session 20 of user ceph-admin.
Dec  6 04:38:38 np0005548918 systemd[1]: Created slice User Slice of UID 42477.
Dec  6 04:38:38 np0005548918 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec  6 04:38:38 np0005548918 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec  6 04:38:38 np0005548918 systemd[1]: Starting User Manager for UID 42477...
Dec  6 04:38:38 np0005548918 systemd-logind[800]: New session 22 of user ceph-admin.
Dec  6 04:38:38 np0005548918 systemd[72489]: Queued start job for default target Main User Target.
Dec  6 04:38:38 np0005548918 systemd[72489]: Created slice User Application Slice.
Dec  6 04:38:38 np0005548918 systemd[72489]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  6 04:38:38 np0005548918 systemd[72489]: Started Daily Cleanup of User's Temporary Directories.
Dec  6 04:38:38 np0005548918 systemd[72489]: Reached target Paths.
Dec  6 04:38:38 np0005548918 systemd[72489]: Reached target Timers.
Dec  6 04:38:38 np0005548918 systemd[72489]: Starting D-Bus User Message Bus Socket...
Dec  6 04:38:38 np0005548918 systemd[72489]: Starting Create User's Volatile Files and Directories...
Dec  6 04:38:38 np0005548918 systemd[72489]: Listening on D-Bus User Message Bus Socket.
Dec  6 04:38:38 np0005548918 systemd[72489]: Reached target Sockets.
Dec  6 04:38:38 np0005548918 systemd[72489]: Finished Create User's Volatile Files and Directories.
Dec  6 04:38:38 np0005548918 systemd[72489]: Reached target Basic System.
Dec  6 04:38:38 np0005548918 systemd[72489]: Reached target Main User Target.
Dec  6 04:38:38 np0005548918 systemd[72489]: Startup finished in 161ms.
Dec  6 04:38:38 np0005548918 systemd[1]: Started User Manager for UID 42477.
Dec  6 04:38:38 np0005548918 systemd[1]: Started Session 20 of User ceph-admin.
Dec  6 04:38:38 np0005548918 systemd[1]: Started Session 22 of User ceph-admin.
Dec  6 04:38:39 np0005548918 systemd-logind[800]: New session 23 of user ceph-admin.
Dec  6 04:38:39 np0005548918 systemd[1]: Started Session 23 of User ceph-admin.
Dec  6 04:38:39 np0005548918 systemd-logind[800]: New session 24 of user ceph-admin.
Dec  6 04:38:39 np0005548918 systemd[1]: Started Session 24 of User ceph-admin.
Dec  6 04:38:39 np0005548918 systemd-logind[800]: New session 25 of user ceph-admin.
Dec  6 04:38:39 np0005548918 systemd[1]: Started Session 25 of User ceph-admin.
Dec  6 04:38:40 np0005548918 systemd-logind[800]: New session 26 of user ceph-admin.
Dec  6 04:38:40 np0005548918 systemd[1]: Started Session 26 of User ceph-admin.
Dec  6 04:38:40 np0005548918 systemd-logind[800]: New session 27 of user ceph-admin.
Dec  6 04:38:40 np0005548918 systemd[1]: Started Session 27 of User ceph-admin.
Dec  6 04:38:41 np0005548918 systemd-logind[800]: New session 28 of user ceph-admin.
Dec  6 04:38:41 np0005548918 systemd[1]: Started Session 28 of User ceph-admin.
Dec  6 04:38:41 np0005548918 systemd-logind[800]: New session 29 of user ceph-admin.
Dec  6 04:38:41 np0005548918 systemd[1]: Started Session 29 of User ceph-admin.
Dec  6 04:38:41 np0005548918 systemd-logind[800]: New session 30 of user ceph-admin.
Dec  6 04:38:41 np0005548918 systemd[1]: Started Session 30 of User ceph-admin.
Dec  6 04:38:43 np0005548918 systemd-logind[800]: New session 31 of user ceph-admin.
Dec  6 04:38:43 np0005548918 systemd[1]: Started Session 31 of User ceph-admin.
Dec  6 04:38:43 np0005548918 systemd-logind[800]: New session 32 of user ceph-admin.
Dec  6 04:38:43 np0005548918 systemd[1]: Started Session 32 of User ceph-admin.
Dec  6 04:38:43 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:38 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:38 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:39 np0005548918 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73085 (sysctl)
Dec  6 04:39:39 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:39 np0005548918 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec  6 04:39:39 np0005548918 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec  6 04:39:40 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:40 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:40 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:39:44 np0005548918 systemd[1]: var-lib-containers-storage-overlay-compat299426897-lower\x2dmapped.mount: Deactivated successfully.
Dec  6 04:40:03 np0005548918 podman[73259]: 2025-12-06 09:40:03.226012172 +0000 UTC m=+22.473496810 container create 4fa2958138bf76304fed8fdef3e579370e005a58d8afa7558bafe35ab4a03c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_dubinsky, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:40:03 np0005548918 podman[73259]: 2025-12-06 09:40:03.211628932 +0000 UTC m=+22.459113590 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:03 np0005548918 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec  6 04:40:03 np0005548918 systemd[1]: Started libpod-conmon-4fa2958138bf76304fed8fdef3e579370e005a58d8afa7558bafe35ab4a03c32.scope.
Dec  6 04:40:03 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:03 np0005548918 podman[73259]: 2025-12-06 09:40:03.304842973 +0000 UTC m=+22.552327611 container init 4fa2958138bf76304fed8fdef3e579370e005a58d8afa7558bafe35ab4a03c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_dubinsky, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Dec  6 04:40:03 np0005548918 podman[73259]: 2025-12-06 09:40:03.312672103 +0000 UTC m=+22.560156741 container start 4fa2958138bf76304fed8fdef3e579370e005a58d8afa7558bafe35ab4a03c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_dubinsky, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  6 04:40:03 np0005548918 podman[73259]: 2025-12-06 09:40:03.316758004 +0000 UTC m=+22.564242672 container attach 4fa2958138bf76304fed8fdef3e579370e005a58d8afa7558bafe35ab4a03c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_dubinsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Dec  6 04:40:03 np0005548918 vigilant_dubinsky[73332]: 167 167
Dec  6 04:40:03 np0005548918 systemd[1]: libpod-4fa2958138bf76304fed8fdef3e579370e005a58d8afa7558bafe35ab4a03c32.scope: Deactivated successfully.
Dec  6 04:40:03 np0005548918 podman[73337]: 2025-12-06 09:40:03.353402465 +0000 UTC m=+0.022556082 container died 4fa2958138bf76304fed8fdef3e579370e005a58d8afa7558bafe35ab4a03c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_dubinsky, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:40:03 np0005548918 systemd[1]: var-lib-containers-storage-overlay-10b7d5c940f0ac9a8a975d3895ea6a836e9bace711bccd17b8ec4d33f633a26a-merged.mount: Deactivated successfully.
Dec  6 04:40:03 np0005548918 podman[73337]: 2025-12-06 09:40:03.387862306 +0000 UTC m=+0.057015923 container remove 4fa2958138bf76304fed8fdef3e579370e005a58d8afa7558bafe35ab4a03c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_dubinsky, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1)
Dec  6 04:40:03 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:40:03 np0005548918 systemd[1]: libpod-conmon-4fa2958138bf76304fed8fdef3e579370e005a58d8afa7558bafe35ab4a03c32.scope: Deactivated successfully.
Dec  6 04:40:03 np0005548918 podman[73359]: 2025-12-06 09:40:03.548663738 +0000 UTC m=+0.050963991 container create c7f5b57661dbe6f1e963d8e9ddcefc6ae8c8a1c1b37a11e8e748e971f834d6ee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_noether, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Dec  6 04:40:03 np0005548918 systemd[1]: Started libpod-conmon-c7f5b57661dbe6f1e963d8e9ddcefc6ae8c8a1c1b37a11e8e748e971f834d6ee.scope.
Dec  6 04:40:03 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:03 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39507eb0589b7329b30d8935cd87eaffc7e092b93916bc770bdbe3d295582f8e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:03 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39507eb0589b7329b30d8935cd87eaffc7e092b93916bc770bdbe3d295582f8e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:03 np0005548918 podman[73359]: 2025-12-06 09:40:03.521149528 +0000 UTC m=+0.023449841 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:03 np0005548918 podman[73359]: 2025-12-06 09:40:03.623744297 +0000 UTC m=+0.126044540 container init c7f5b57661dbe6f1e963d8e9ddcefc6ae8c8a1c1b37a11e8e748e971f834d6ee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_noether, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:40:03 np0005548918 podman[73359]: 2025-12-06 09:40:03.629656397 +0000 UTC m=+0.131956660 container start c7f5b57661dbe6f1e963d8e9ddcefc6ae8c8a1c1b37a11e8e748e971f834d6ee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_noether, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Dec  6 04:40:03 np0005548918 podman[73359]: 2025-12-06 09:40:03.63383088 +0000 UTC m=+0.136131123 container attach c7f5b57661dbe6f1e963d8e9ddcefc6ae8c8a1c1b37a11e8e748e971f834d6ee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_noether, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:40:04 np0005548918 sharp_noether[73375]: [
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:    {
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:        "available": false,
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:        "being_replaced": false,
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:        "ceph_device_lvm": false,
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:        "device_id": "QEMU_DVD-ROM_QM00001",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:        "lsm_data": {},
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:        "lvs": [],
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:        "path": "/dev/sr0",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:        "rejected_reasons": [
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "Insufficient space (<5GB)",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "Has a FileSystem"
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:        ],
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:        "sys_api": {
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "actuators": null,
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "device_nodes": [
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:                "sr0"
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            ],
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "devname": "sr0",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "human_readable_size": "482.00 KB",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "id_bus": "ata",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "model": "QEMU DVD-ROM",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "nr_requests": "2",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "parent": "/dev/sr0",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "partitions": {},
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "path": "/dev/sr0",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "removable": "1",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "rev": "2.5+",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "ro": "0",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "rotational": "1",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "sas_address": "",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "sas_device_handle": "",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "scheduler_mode": "mq-deadline",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "sectors": 0,
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "sectorsize": "2048",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "size": 493568.0,
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "support_discard": "2048",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "type": "disk",
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:            "vendor": "QEMU"
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:        }
Dec  6 04:40:04 np0005548918 sharp_noether[73375]:    }
Dec  6 04:40:04 np0005548918 sharp_noether[73375]: ]
Dec  6 04:40:04 np0005548918 systemd[1]: libpod-c7f5b57661dbe6f1e963d8e9ddcefc6ae8c8a1c1b37a11e8e748e971f834d6ee.scope: Deactivated successfully.
Dec  6 04:40:04 np0005548918 podman[73359]: 2025-12-06 09:40:04.437949346 +0000 UTC m=+0.940249569 container died c7f5b57661dbe6f1e963d8e9ddcefc6ae8c8a1c1b37a11e8e748e971f834d6ee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_noether, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:40:04 np0005548918 systemd[1]: var-lib-containers-storage-overlay-39507eb0589b7329b30d8935cd87eaffc7e092b93916bc770bdbe3d295582f8e-merged.mount: Deactivated successfully.
Dec  6 04:40:04 np0005548918 podman[73359]: 2025-12-06 09:40:04.480911659 +0000 UTC m=+0.983211882 container remove c7f5b57661dbe6f1e963d8e9ddcefc6ae8c8a1c1b37a11e8e748e971f834d6ee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_noether, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec  6 04:40:04 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:40:04 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:40:04 np0005548918 systemd[1]: libpod-conmon-c7f5b57661dbe6f1e963d8e9ddcefc6ae8c8a1c1b37a11e8e748e971f834d6ee.scope: Deactivated successfully.
Dec  6 04:40:06 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:40:06 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:40:07 np0005548918 podman[75448]: 2025-12-06 09:40:07.045090893 +0000 UTC m=+0.035602470 container create 54628faec01604938711f8a4c7f92dc92f6ab0ded2629a431392efaf7e48a2e1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_sammet, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:40:07 np0005548918 systemd[1]: Started libpod-conmon-54628faec01604938711f8a4c7f92dc92f6ab0ded2629a431392efaf7e48a2e1.scope.
Dec  6 04:40:07 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:07 np0005548918 podman[75448]: 2025-12-06 09:40:07.119087828 +0000 UTC m=+0.109599425 container init 54628faec01604938711f8a4c7f92dc92f6ab0ded2629a431392efaf7e48a2e1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_sammet, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:40:07 np0005548918 podman[75448]: 2025-12-06 09:40:07.02968146 +0000 UTC m=+0.020193067 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:07 np0005548918 podman[75448]: 2025-12-06 09:40:07.128325553 +0000 UTC m=+0.118837120 container start 54628faec01604938711f8a4c7f92dc92f6ab0ded2629a431392efaf7e48a2e1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_sammet, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:40:07 np0005548918 podman[75448]: 2025-12-06 09:40:07.132033682 +0000 UTC m=+0.122545289 container attach 54628faec01604938711f8a4c7f92dc92f6ab0ded2629a431392efaf7e48a2e1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_sammet, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:40:07 np0005548918 pensive_sammet[75464]: 167 167
Dec  6 04:40:07 np0005548918 systemd[1]: libpod-54628faec01604938711f8a4c7f92dc92f6ab0ded2629a431392efaf7e48a2e1.scope: Deactivated successfully.
Dec  6 04:40:07 np0005548918 podman[75448]: 2025-12-06 09:40:07.133803558 +0000 UTC m=+0.124315145 container died 54628faec01604938711f8a4c7f92dc92f6ab0ded2629a431392efaf7e48a2e1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_sammet, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:40:07 np0005548918 podman[75448]: 2025-12-06 09:40:07.168412495 +0000 UTC m=+0.158924072 container remove 54628faec01604938711f8a4c7f92dc92f6ab0ded2629a431392efaf7e48a2e1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_sammet, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec  6 04:40:07 np0005548918 systemd[1]: libpod-conmon-54628faec01604938711f8a4c7f92dc92f6ab0ded2629a431392efaf7e48a2e1.scope: Deactivated successfully.
Dec  6 04:40:07 np0005548918 podman[75481]: 2025-12-06 09:40:07.239150596 +0000 UTC m=+0.043587095 container create 961ecdf2d2ebea0a2b6508d3507b87f07346c0642881ddef1ab44d561753f836 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jennings, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Dec  6 04:40:07 np0005548918 systemd[1]: Started libpod-conmon-961ecdf2d2ebea0a2b6508d3507b87f07346c0642881ddef1ab44d561753f836.scope.
Dec  6 04:40:07 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:07 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75526dd52531fe9493f62cc2825f3ab33d29743a615c237c82abcd7e801ee66e/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:07 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75526dd52531fe9493f62cc2825f3ab33d29743a615c237c82abcd7e801ee66e/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:07 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75526dd52531fe9493f62cc2825f3ab33d29743a615c237c82abcd7e801ee66e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:07 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75526dd52531fe9493f62cc2825f3ab33d29743a615c237c82abcd7e801ee66e/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:07 np0005548918 podman[75481]: 2025-12-06 09:40:07.306453138 +0000 UTC m=+0.110889677 container init 961ecdf2d2ebea0a2b6508d3507b87f07346c0642881ddef1ab44d561753f836 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jennings, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Dec  6 04:40:07 np0005548918 podman[75481]: 2025-12-06 09:40:07.220103907 +0000 UTC m=+0.024540426 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:07 np0005548918 podman[75481]: 2025-12-06 09:40:07.3156071 +0000 UTC m=+0.120043619 container start 961ecdf2d2ebea0a2b6508d3507b87f07346c0642881ddef1ab44d561753f836 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jennings, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:40:07 np0005548918 podman[75481]: 2025-12-06 09:40:07.319824075 +0000 UTC m=+0.124260624 container attach 961ecdf2d2ebea0a2b6508d3507b87f07346c0642881ddef1ab44d561753f836 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jennings, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Dec  6 04:40:07 np0005548918 systemd[1]: libpod-961ecdf2d2ebea0a2b6508d3507b87f07346c0642881ddef1ab44d561753f836.scope: Deactivated successfully.
Dec  6 04:40:07 np0005548918 podman[75481]: 2025-12-06 09:40:07.407362374 +0000 UTC m=+0.211798903 container died 961ecdf2d2ebea0a2b6508d3507b87f07346c0642881ddef1ab44d561753f836 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jennings, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:40:07 np0005548918 podman[75481]: 2025-12-06 09:40:07.441525296 +0000 UTC m=+0.245961795 container remove 961ecdf2d2ebea0a2b6508d3507b87f07346c0642881ddef1ab44d561753f836 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jennings, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  6 04:40:07 np0005548918 systemd[1]: libpod-conmon-961ecdf2d2ebea0a2b6508d3507b87f07346c0642881ddef1ab44d561753f836.scope: Deactivated successfully.
Dec  6 04:40:07 np0005548918 systemd[1]: Reloading.
Dec  6 04:40:07 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:07 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:07 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:40:07 np0005548918 systemd[1]: Reloading.
Dec  6 04:40:07 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:07 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:07 np0005548918 systemd[1]: Reached target All Ceph clusters and services.
Dec  6 04:40:07 np0005548918 systemd[1]: Reloading.
Dec  6 04:40:08 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:08 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:08 np0005548918 systemd[1]: Reached target Ceph cluster 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:40:08 np0005548918 systemd[1]: Reloading.
Dec  6 04:40:08 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:08 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:08 np0005548918 systemd[1]: Reloading.
Dec  6 04:40:08 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:08 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:08 np0005548918 systemd[1]: Created slice Slice /system/ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:40:08 np0005548918 systemd[1]: Reached target System Time Set.
Dec  6 04:40:08 np0005548918 systemd[1]: Reached target System Time Synchronized.
Dec  6 04:40:08 np0005548918 systemd[1]: Starting Ceph mon.compute-2 for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:40:08 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:40:08 np0005548918 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  6 04:40:09 np0005548918 podman[75779]: 2025-12-06 09:40:08.945274869 +0000 UTC m=+0.028329337 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:09 np0005548918 podman[75779]: 2025-12-06 09:40:09.248735671 +0000 UTC m=+0.331790049 container create 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True)
Dec  6 04:40:09 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06c39cabf5d107ba59c1b23a9f9a646a7b1a22c02e00258240830d1a4eeb499f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:09 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06c39cabf5d107ba59c1b23a9f9a646a7b1a22c02e00258240830d1a4eeb499f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:09 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06c39cabf5d107ba59c1b23a9f9a646a7b1a22c02e00258240830d1a4eeb499f/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:09 np0005548918 podman[75779]: 2025-12-06 09:40:09.84305419 +0000 UTC m=+0.926108678 container init 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:40:09 np0005548918 podman[75779]: 2025-12-06 09:40:09.852546794 +0000 UTC m=+0.935601202 container start 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Dec  6 04:40:09 np0005548918 bash[75779]: 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684
Dec  6 04:40:09 np0005548918 systemd[1]: Started Ceph mon.compute-2 for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: set uid:gid to 167:167 (ceph:ceph)
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: pidfile_write: ignore empty --pid-file
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: load: jerasure load: lrc 
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: RocksDB version: 7.9.2
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Git sha 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Compile date 2025-07-17 03:12:14
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: DB SUMMARY
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: DB Session ID:  XHXHRBK6HNZMFGDONKUX
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: CURRENT file:  CURRENT
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: IDENTITY file:  IDENTITY
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                         Options.error_if_exists: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                       Options.create_if_missing: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                         Options.paranoid_checks: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                                     Options.env: 0x55784a9d2c20
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                                      Options.fs: PosixFileSystem
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                                Options.info_log: 0x55784c753a20
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                Options.max_file_opening_threads: 16
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                              Options.statistics: (nil)
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                               Options.use_fsync: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                       Options.max_log_file_size: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                         Options.allow_fallocate: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                        Options.use_direct_reads: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:          Options.create_missing_column_families: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                              Options.db_log_dir: 
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                                 Options.wal_dir: 
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                   Options.advise_random_on_open: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                    Options.write_buffer_manager: 0x55784c757900
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                            Options.rate_limiter: (nil)
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                  Options.unordered_write: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                               Options.row_cache: None
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                              Options.wal_filter: None
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.allow_ingest_behind: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.two_write_queues: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.manual_wal_flush: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.wal_compression: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.atomic_flush: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                 Options.log_readahead_size: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.allow_data_in_errors: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.db_host_id: __hostname__
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.max_background_jobs: 2
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.max_background_compactions: -1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.max_subcompactions: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.max_total_wal_size: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                          Options.max_open_files: -1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                          Options.bytes_per_sync: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:       Options.compaction_readahead_size: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                  Options.max_background_flushes: -1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Compression algorithms supported:
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: #011kZSTD supported: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: #011kXpressCompression supported: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: #011kBZip2Compression supported: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: #011kLZ4Compression supported: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: #011kZlibCompression supported: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: #011kLZ4HCCompression supported: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: #011kSnappyCompression supported: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:           Options.merge_operator: 
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:        Options.compaction_filter: None
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55784c7525c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55784c777350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:        Options.write_buffer_size: 33554432
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:  Options.max_write_buffer_number: 2
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:          Options.compression: NoCompression
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.num_levels: 7
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 32a3339c-60f1-43dd-9342-fe763f9bb1ae
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014009893666, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec  6 04:40:09 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014010253607, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014010253863, "job": 1, "event": "recovery_finished"}
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55784c778e00
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: rocksdb: DB pointer 0x55784c882000
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.5 total, 0.5 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.36              0.00         1    0.360       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.36              0.00         1    0.360       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.36              0.00         1    0.360       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.36              0.00         1    0.360       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.5 total, 0.5 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.4 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.4 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55784c777350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(???) e0 preinit fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).mds e1 new map
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).mds e1 print_map#012e1#012btime 2025-12-06T09:37:41:285728+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: -1#012 #012No filesystems configured
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 0 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 1 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e27 crush map has features 3314933000852226048, adjusting msgr requires
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e27 crush map has features 288514051259236352, adjusting msgr requires
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e27 crush map has features 288514051259236352, adjusting msgr requires
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).osd e27 crush map has features 288514051259236352, adjusting msgr requires
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/2735601092' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/250124401' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/250124401' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:10 np0005548918 ceph-mon[75798]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Dec  6 04:40:12 np0005548918 ceph-mon[75798]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Dec  6 04:40:12 np0005548918 ceph-mon[75798]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Dec  6 04:40:12 np0005548918 ceph-mon[75798]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Dec  6 04:40:12 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  6 04:40:12 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec  6 04:40:12 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec  6 04:40:14 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec  6 04:40:18 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  6 04:40:18 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec  6 04:40:18 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec  6 04:40:18 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec  6 04:40:18 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Dec  6 04:40:19 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  6 04:40:19 np0005548918 ceph-mon[75798]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025,kernel_version=5.14.0-645.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,os=Linux}
Dec  6 04:40:19 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e28 e28: 2 total, 2 up, 2 in
Dec  6 04:40:19 np0005548918 ceph-mon[75798]: Deploying daemon mon.compute-1 on compute-1
Dec  6 04:40:19 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/3524701111' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Dec  6 04:40:19 np0005548918 ceph-mon[75798]: mon.compute-0 calling monitor election
Dec  6 04:40:19 np0005548918 ceph-mon[75798]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e28 _set_new_cache_sizes cache_size:1019933829 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:40:20 np0005548918 podman[75927]: 2025-12-06 09:40:20.456731213 +0000 UTC m=+0.053248323 container create 824fd0affc2562ccbb93967652dcc80cdf55186824da207eabba2dab6e9a902d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_swirles, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: mon.compute-2 calling monitor election
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: Health detail: HEALTH_WARN 3 pool(s) do not have an application enabled
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: [WRN] POOL_APP_NOT_ENABLED: 3 pool(s) do not have an application enabled
Dec  6 04:40:20 np0005548918 ceph-mon[75798]:    application not enabled on pool 'images'
Dec  6 04:40:20 np0005548918 ceph-mon[75798]:    application not enabled on pool 'cephfs.cephfs.meta'
Dec  6 04:40:20 np0005548918 ceph-mon[75798]:    application not enabled on pool 'cephfs.cephfs.data'
Dec  6 04:40:20 np0005548918 ceph-mon[75798]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/3524701111' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.oazbvn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.oazbvn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec  6 04:40:20 np0005548918 systemd[1]: Started libpod-conmon-824fd0affc2562ccbb93967652dcc80cdf55186824da207eabba2dab6e9a902d.scope.
Dec  6 04:40:20 np0005548918 podman[75927]: 2025-12-06 09:40:20.42222196 +0000 UTC m=+0.018739080 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:20 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:20 np0005548918 podman[75927]: 2025-12-06 09:40:20.596843302 +0000 UTC m=+0.193360432 container init 824fd0affc2562ccbb93967652dcc80cdf55186824da207eabba2dab6e9a902d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_swirles, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:40:20 np0005548918 podman[75927]: 2025-12-06 09:40:20.603718533 +0000 UTC m=+0.200235643 container start 824fd0affc2562ccbb93967652dcc80cdf55186824da207eabba2dab6e9a902d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_swirles, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:40:20 np0005548918 podman[75927]: 2025-12-06 09:40:20.607203834 +0000 UTC m=+0.203720944 container attach 824fd0affc2562ccbb93967652dcc80cdf55186824da207eabba2dab6e9a902d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_swirles, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:40:20 np0005548918 pedantic_swirles[75943]: 167 167
Dec  6 04:40:20 np0005548918 systemd[1]: libpod-824fd0affc2562ccbb93967652dcc80cdf55186824da207eabba2dab6e9a902d.scope: Deactivated successfully.
Dec  6 04:40:20 np0005548918 podman[75927]: 2025-12-06 09:40:20.610314243 +0000 UTC m=+0.206831343 container died 824fd0affc2562ccbb93967652dcc80cdf55186824da207eabba2dab6e9a902d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_swirles, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec  6 04:40:20 np0005548918 systemd[1]: var-lib-containers-storage-overlay-cd7e356e233cf2da3f0bff88f0270431ab60af7e937db47cc2e87c220bb29ff8-merged.mount: Deactivated successfully.
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec  6 04:40:20 np0005548918 podman[75927]: 2025-12-06 09:40:20.985651522 +0000 UTC m=+0.582168632 container remove 824fd0affc2562ccbb93967652dcc80cdf55186824da207eabba2dab6e9a902d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_swirles, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: paxos.1).electionLogic(10) init, last seen epoch 10
Dec  6 04:40:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  6 04:40:21 np0005548918 systemd[1]: Reloading.
Dec  6 04:40:21 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:21 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:21 np0005548918 systemd[1]: libpod-conmon-824fd0affc2562ccbb93967652dcc80cdf55186824da207eabba2dab6e9a902d.scope: Deactivated successfully.
Dec  6 04:40:21 np0005548918 systemd[1]: Reloading.
Dec  6 04:40:21 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:21 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:21 np0005548918 systemd[1]: Starting Ceph mgr.compute-2.oazbvn for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:40:21 np0005548918 podman[76089]: 2025-12-06 09:40:21.78951342 +0000 UTC m=+0.043585035 container create 4821735c91549e3508140534fa876d8b579a0b2bc573fd62060cdb22310657ee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Dec  6 04:40:21 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b04bea32d842e2034e5ca6ab4d10d8a309440f4c8176319fa9e7622fd4a14ca/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:21 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b04bea32d842e2034e5ca6ab4d10d8a309440f4c8176319fa9e7622fd4a14ca/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:21 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b04bea32d842e2034e5ca6ab4d10d8a309440f4c8176319fa9e7622fd4a14ca/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:21 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b04bea32d842e2034e5ca6ab4d10d8a309440f4c8176319fa9e7622fd4a14ca/merged/var/lib/ceph/mgr/ceph-compute-2.oazbvn supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:21 np0005548918 podman[76089]: 2025-12-06 09:40:21.844368784 +0000 UTC m=+0.098440379 container init 4821735c91549e3508140534fa876d8b579a0b2bc573fd62060cdb22310657ee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  6 04:40:21 np0005548918 podman[76089]: 2025-12-06 09:40:21.848992561 +0000 UTC m=+0.103064136 container start 4821735c91549e3508140534fa876d8b579a0b2bc573fd62060cdb22310657ee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:40:21 np0005548918 bash[76089]: 4821735c91549e3508140534fa876d8b579a0b2bc573fd62060cdb22310657ee
Dec  6 04:40:21 np0005548918 podman[76089]: 2025-12-06 09:40:21.77106474 +0000 UTC m=+0.025136345 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:21 np0005548918 systemd[1]: Started Ceph mgr.compute-2.oazbvn for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:40:21 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  6 04:40:22 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  6 04:40:22 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  6 04:40:23 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  6 04:40:24 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  6 04:40:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  6 04:40:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  6 04:40:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  6 04:40:26 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  6 04:40:26 np0005548918 ceph-mgr[76108]: set uid:gid to 167:167 (ceph:ceph)
Dec  6 04:40:26 np0005548918 ceph-mgr[76108]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  6 04:40:26 np0005548918 ceph-mgr[76108]: pidfile_write: ignore empty --pid-file
Dec  6 04:40:26 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'alerts'
Dec  6 04:40:26 np0005548918 ceph-mgr[76108]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:40:26 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'balancer'
Dec  6 04:40:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:26.435+0000 7f8e87e9f140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:40:26 np0005548918 ceph-mgr[76108]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:40:26 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'cephadm'
Dec  6 04:40:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:26.513+0000 7f8e87e9f140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:40:26 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e29 e29: 2 total, 2 up, 2 in
Dec  6 04:40:26 np0005548918 ceph-mon[75798]: mon.compute-0 calling monitor election
Dec  6 04:40:26 np0005548918 ceph-mon[75798]: mon.compute-2 calling monitor election
Dec  6 04:40:26 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/1898003818' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Dec  6 04:40:26 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:26 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:26 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:26 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:26 np0005548918 ceph-mon[75798]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Dec  6 04:40:26 np0005548918 ceph-mon[75798]: Health detail: HEALTH_WARN 3 pool(s) do not have an application enabled
Dec  6 04:40:26 np0005548918 ceph-mon[75798]: [WRN] POOL_APP_NOT_ENABLED: 3 pool(s) do not have an application enabled
Dec  6 04:40:26 np0005548918 ceph-mon[75798]:    application not enabled on pool 'images'
Dec  6 04:40:26 np0005548918 ceph-mon[75798]:    application not enabled on pool 'cephfs.cephfs.meta'
Dec  6 04:40:26 np0005548918 ceph-mon[75798]:    application not enabled on pool 'cephfs.cephfs.data'
Dec  6 04:40:26 np0005548918 ceph-mon[75798]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Dec  6 04:40:27 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'crash'
Dec  6 04:40:27 np0005548918 ceph-mgr[76108]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:40:27 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'dashboard'
Dec  6 04:40:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:27.320+0000 7f8e87e9f140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e30 e30: 2 total, 2 up, 2 in
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: mon.compute-1 calling monitor election
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/1898003818' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.sauzid", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.sauzid", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec  6 04:40:27 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/21529314' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Dec  6 04:40:27 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'devicehealth'
Dec  6 04:40:27 np0005548918 ceph-mgr[76108]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:40:27 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'diskprediction_local'
Dec  6 04:40:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:27.964+0000 7f8e87e9f140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:40:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  6 04:40:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  6 04:40:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]:  from numpy import show_config as show_numpy_config
Dec  6 04:40:28 np0005548918 ceph-mgr[76108]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:40:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:28.135+0000 7f8e87e9f140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:40:28 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'influx'
Dec  6 04:40:28 np0005548918 ceph-mgr[76108]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:40:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:28.209+0000 7f8e87e9f140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:40:28 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'insights'
Dec  6 04:40:28 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'iostat'
Dec  6 04:40:28 np0005548918 ceph-mgr[76108]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:40:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:28.342+0000 7f8e87e9f140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:40:28 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'k8sevents'
Dec  6 04:40:28 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'localpool'
Dec  6 04:40:28 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'mds_autoscaler'
Dec  6 04:40:28 np0005548918 ceph-mon[75798]: Deploying daemon mgr.compute-1.sauzid on compute-1
Dec  6 04:40:28 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:28 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:28 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:28 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:40:28 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/21529314' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'mirroring'
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'nfs'
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:29.349+0000 7f8e87e9f140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'orchestrator'
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:29.567+0000 7f8e87e9f140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'osd_perf_query'
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:29.643+0000 7f8e87e9f140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'osd_support'
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'pg_autoscaler'
Dec  6 04:40:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:29.709+0000 7f8e87e9f140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:29.787+0000 7f8e87e9f140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'progress'
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:29.865+0000 7f8e87e9f140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:40:29 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'prometheus'
Dec  6 04:40:30 np0005548918 ceph-mgr[76108]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:40:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:30.201+0000 7f8e87e9f140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:40:30 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rbd_support'
Dec  6 04:40:30 np0005548918 ceph-mgr[76108]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:40:30 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'restful'
Dec  6 04:40:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:30.291+0000 7f8e87e9f140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:40:30 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rgw'
Dec  6 04:40:30 np0005548918 ceph-mgr[76108]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:40:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:30.707+0000 7f8e87e9f140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:40:30 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rook'
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'selftest'
Dec  6 04:40:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:31.243+0000 7f8e87e9f140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'snap_schedule'
Dec  6 04:40:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:31.309+0000 7f8e87e9f140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'stats'
Dec  6 04:40:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:31.382+0000 7f8e87e9f140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'status'
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'telegraf'
Dec  6 04:40:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:31.543+0000 7f8e87e9f140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:31.618+0000 7f8e87e9f140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'telemetry'
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'test_orchestrator'
Dec  6 04:40:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:31.775+0000 7f8e87e9f140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:31.988+0000 7f8e87e9f140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:40:31 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'volumes'
Dec  6 04:40:32 np0005548918 ceph-mgr[76108]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:40:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:32.250+0000 7f8e87e9f140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:40:32 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'zabbix'
Dec  6 04:40:32 np0005548918 ceph-mgr[76108]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:40:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:40:32.318+0000 7f8e87e9f140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:40:32 np0005548918 ceph-mgr[76108]: ms_deliver_dispatch: unhandled message 0x5601d05c0d00 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Dec  6 04:40:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e30 _set_new_cache_sizes cache_size:1020053174 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:40:33 np0005548918 ceph-mon[75798]: Health check cleared: POOL_APP_NOT_ENABLED (was: 2 pool(s) do not have an application enabled)
Dec  6 04:40:33 np0005548918 ceph-mon[75798]: Cluster is now healthy
Dec  6 04:40:33 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:34 np0005548918 podman[76232]: 2025-12-06 09:40:34.720110302 +0000 UTC m=+0.024106572 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:36 np0005548918 podman[76232]: 2025-12-06 09:40:36.629492812 +0000 UTC m=+1.933489092 container create 570b6b9464267318461b24219d3cd8afea09c892f790492d5cd43ac0248fbfee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wizardly_goldstine, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  6 04:40:38 np0005548918 systemd[1]: Started libpod-conmon-570b6b9464267318461b24219d3cd8afea09c892f790492d5cd43ac0248fbfee.scope.
Dec  6 04:40:38 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:39 np0005548918 podman[76232]: 2025-12-06 09:40:39.449889836 +0000 UTC m=+4.753886116 container init 570b6b9464267318461b24219d3cd8afea09c892f790492d5cd43ac0248fbfee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wizardly_goldstine, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:40:39 np0005548918 podman[76232]: 2025-12-06 09:40:39.463004896 +0000 UTC m=+4.767001146 container start 570b6b9464267318461b24219d3cd8afea09c892f790492d5cd43ac0248fbfee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wizardly_goldstine, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:40:39 np0005548918 wizardly_goldstine[76248]: 167 167
Dec  6 04:40:39 np0005548918 systemd[1]: libpod-570b6b9464267318461b24219d3cd8afea09c892f790492d5cd43ac0248fbfee.scope: Deactivated successfully.
Dec  6 04:40:39 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:39 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/2318794964' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Dec  6 04:40:39 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:39 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:39 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/2318794964' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Dec  6 04:40:39 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:39 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  6 04:40:39 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Dec  6 04:40:39 np0005548918 ceph-mon[75798]: Deploying daemon crash.compute-2 on compute-2
Dec  6 04:40:40 np0005548918 podman[76232]: 2025-12-06 09:40:40.274802698 +0000 UTC m=+5.578799048 container attach 570b6b9464267318461b24219d3cd8afea09c892f790492d5cd43ac0248fbfee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wizardly_goldstine, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Dec  6 04:40:40 np0005548918 podman[76232]: 2025-12-06 09:40:40.277612217 +0000 UTC m=+5.581608517 container died 570b6b9464267318461b24219d3cd8afea09c892f790492d5cd43ac0248fbfee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wizardly_goldstine, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:40:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e30 _set_new_cache_sizes cache_size:1020054711 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:40:40 np0005548918 systemd[1]: var-lib-containers-storage-overlay-03ab857df3b9034c39bda4c91168a7444e7ccff1c70fb120e85c158bd3f15c17-merged.mount: Deactivated successfully.
Dec  6 04:40:40 np0005548918 systemd[72489]: Starting Mark boot as successful...
Dec  6 04:40:40 np0005548918 systemd[72489]: Finished Mark boot as successful.
Dec  6 04:40:40 np0005548918 podman[76232]: 2025-12-06 09:40:40.360706403 +0000 UTC m=+5.664702643 container remove 570b6b9464267318461b24219d3cd8afea09c892f790492d5cd43ac0248fbfee (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wizardly_goldstine, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:40:40 np0005548918 systemd[1]: Reloading.
Dec  6 04:40:40 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:40 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:40 np0005548918 systemd[1]: libpod-conmon-570b6b9464267318461b24219d3cd8afea09c892f790492d5cd43ac0248fbfee.scope: Deactivated successfully.
Dec  6 04:40:40 np0005548918 systemd[1]: Reloading.
Dec  6 04:40:40 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:40 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:40 np0005548918 systemd[1]: Starting Ceph crash.compute-2 for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:40:41 np0005548918 podman[76394]: 2025-12-06 09:40:41.202267257 +0000 UTC m=+0.040267649 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:41 np0005548918 podman[76394]: 2025-12-06 09:40:41.814254001 +0000 UTC m=+0.652254403 container create 29aae73f62af9799337a7508d50b93469fcfc28d710761a3c3032f196e258964 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  6 04:40:41 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbb1c79b92a2aa85bf7171d7f2f1ed6ebe98aadc2b388e8a212d64be49ff22f0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:41 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbb1c79b92a2aa85bf7171d7f2f1ed6ebe98aadc2b388e8a212d64be49ff22f0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:41 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbb1c79b92a2aa85bf7171d7f2f1ed6ebe98aadc2b388e8a212d64be49ff22f0/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:41 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbb1c79b92a2aa85bf7171d7f2f1ed6ebe98aadc2b388e8a212d64be49ff22f0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:41 np0005548918 podman[76394]: 2025-12-06 09:40:41.987070676 +0000 UTC m=+0.825071108 container init 29aae73f62af9799337a7508d50b93469fcfc28d710761a3c3032f196e258964 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Dec  6 04:40:41 np0005548918 podman[76394]: 2025-12-06 09:40:41.99315738 +0000 UTC m=+0.831157782 container start 29aae73f62af9799337a7508d50b93469fcfc28d710761a3c3032f196e258964 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:40:41 np0005548918 bash[76394]: 29aae73f62af9799337a7508d50b93469fcfc28d710761a3c3032f196e258964
Dec  6 04:40:42 np0005548918 systemd[1]: Started Ceph crash.compute-2 for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:40:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2[76409]: INFO:ceph-crash:pinging cluster to exercise our key
Dec  6 04:40:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2[76409]: 2025-12-06T09:40:42.161+0000 7fe686a42640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec  6 04:40:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2[76409]: 2025-12-06T09:40:42.161+0000 7fe686a42640 -1 AuthRegistry(0x7fe680069b10) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec  6 04:40:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2[76409]: 2025-12-06T09:40:42.162+0000 7fe686a42640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec  6 04:40:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2[76409]: 2025-12-06T09:40:42.162+0000 7fe686a42640 -1 AuthRegistry(0x7fe686a40ff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec  6 04:40:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2[76409]: 2025-12-06T09:40:42.163+0000 7fe67f7fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec  6 04:40:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2[76409]: 2025-12-06T09:40:42.164+0000 7fe684fb8640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec  6 04:40:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2[76409]: 2025-12-06T09:40:42.167+0000 7fe67ffff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec  6 04:40:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2[76409]: 2025-12-06T09:40:42.167+0000 7fe686a42640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Dec  6 04:40:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2[76409]: [errno 13] RADOS permission denied (error connecting to the cluster)
Dec  6 04:40:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-2[76409]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Dec  6 04:40:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e30 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:40:45 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/1940510154' entity='client.admin' 
Dec  6 04:40:45 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:45 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:45 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:45 np0005548918 podman[76517]: 2025-12-06 09:40:45.628891659 +0000 UTC m=+0.023219894 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:45 np0005548918 podman[76517]: 2025-12-06 09:40:45.862364663 +0000 UTC m=+0.256692898 container create 25d9e846377d01fd71f7b5c6a21391e76468b57514087e5c3795181a3506f2dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_chatterjee, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:40:45 np0005548918 systemd[1]: Started libpod-conmon-25d9e846377d01fd71f7b5c6a21391e76468b57514087e5c3795181a3506f2dd.scope.
Dec  6 04:40:45 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:47 np0005548918 podman[76517]: 2025-12-06 09:40:47.758837041 +0000 UTC m=+2.153165286 container init 25d9e846377d01fd71f7b5c6a21391e76468b57514087e5c3795181a3506f2dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_chatterjee, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Dec  6 04:40:47 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:47 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:40:47 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:40:47 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:47 np0005548918 ceph-mon[75798]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Dec  6 04:40:47 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:47 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:47 np0005548918 podman[76517]: 2025-12-06 09:40:47.772314702 +0000 UTC m=+2.166642907 container start 25d9e846377d01fd71f7b5c6a21391e76468b57514087e5c3795181a3506f2dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_chatterjee, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec  6 04:40:47 np0005548918 podman[76517]: 2025-12-06 09:40:47.776058921 +0000 UTC m=+2.170387126 container attach 25d9e846377d01fd71f7b5c6a21391e76468b57514087e5c3795181a3506f2dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_chatterjee, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  6 04:40:47 np0005548918 systemd[1]: libpod-25d9e846377d01fd71f7b5c6a21391e76468b57514087e5c3795181a3506f2dd.scope: Deactivated successfully.
Dec  6 04:40:47 np0005548918 suspicious_chatterjee[76533]: 167 167
Dec  6 04:40:47 np0005548918 podman[76517]: 2025-12-06 09:40:47.779884424 +0000 UTC m=+2.174212629 container died 25d9e846377d01fd71f7b5c6a21391e76468b57514087e5c3795181a3506f2dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_chatterjee, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:40:47 np0005548918 conmon[76533]: conmon 25d9e846377d01fd71f7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-25d9e846377d01fd71f7b5c6a21391e76468b57514087e5c3795181a3506f2dd.scope/container/memory.events
Dec  6 04:40:47 np0005548918 systemd[1]: var-lib-containers-storage-overlay-7cfa998b4e1d49b77f48971621d9258fec30dcdaaea38c028edc9c71af27444e-merged.mount: Deactivated successfully.
Dec  6 04:40:47 np0005548918 podman[76517]: 2025-12-06 09:40:47.824968025 +0000 UTC m=+2.219296240 container remove 25d9e846377d01fd71f7b5c6a21391e76468b57514087e5c3795181a3506f2dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_chatterjee, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True)
Dec  6 04:40:47 np0005548918 systemd[1]: libpod-conmon-25d9e846377d01fd71f7b5c6a21391e76468b57514087e5c3795181a3506f2dd.scope: Deactivated successfully.
Dec  6 04:40:48 np0005548918 podman[76557]: 2025-12-06 09:40:48.030690151 +0000 UTC m=+0.049561106 container create 48189a8bb5624857694e1303043d39f173edceadeec90990079589abf72048cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_bouman, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True)
Dec  6 04:40:48 np0005548918 systemd[1]: Started libpod-conmon-48189a8bb5624857694e1303043d39f173edceadeec90990079589abf72048cc.scope.
Dec  6 04:40:48 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:48 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbd71b9af36a9726938959c5c1cccbe42a6d750904615a06f032498677c64682/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:48 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbd71b9af36a9726938959c5c1cccbe42a6d750904615a06f032498677c64682/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:48 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbd71b9af36a9726938959c5c1cccbe42a6d750904615a06f032498677c64682/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:48 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbd71b9af36a9726938959c5c1cccbe42a6d750904615a06f032498677c64682/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:48 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbd71b9af36a9726938959c5c1cccbe42a6d750904615a06f032498677c64682/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:48 np0005548918 podman[76557]: 2025-12-06 09:40:48.008602745 +0000 UTC m=+0.027473710 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:48 np0005548918 podman[76557]: 2025-12-06 09:40:48.110958958 +0000 UTC m=+0.129829983 container init 48189a8bb5624857694e1303043d39f173edceadeec90990079589abf72048cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_bouman, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid)
Dec  6 04:40:48 np0005548918 podman[76557]: 2025-12-06 09:40:48.12041836 +0000 UTC m=+0.139289325 container start 48189a8bb5624857694e1303043d39f173edceadeec90990079589abf72048cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_bouman, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Dec  6 04:40:48 np0005548918 podman[76557]: 2025-12-06 09:40:48.126244966 +0000 UTC m=+0.145116081 container attach 48189a8bb5624857694e1303043d39f173edceadeec90990079589abf72048cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_bouman, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  6 04:40:48 np0005548918 intelligent_bouman[76573]: --> passed data devices: 0 physical, 1 LVM
Dec  6 04:40:48 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:40:48 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:40:48 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new b46cc65b-25ba-490a-8b8e-91e4407f3aed
Dec  6 04:40:48 np0005548918 ceph-mon[75798]: Saving service ingress.rgw.default spec with placement count:2
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "b46cc65b-25ba-490a-8b8e-91e4407f3aed"} v 0)
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/569971095' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "b46cc65b-25ba-490a-8b8e-91e4407f3aed"}]: dispatch
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e31 e31: 3 total, 2 up, 3 in
Dec  6 04:40:49 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Dec  6 04:40:49 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Dec  6 04:40:49 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  6 04:40:49 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec  6 04:40:49 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Dec  6 04:40:49 np0005548918 lvm[76634]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:40:49 np0005548918 lvm[76634]: VG ceph_vg0 finished
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0)
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3771187413' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec  6 04:40:49 np0005548918 intelligent_bouman[76573]: stderr: got monmap epoch 3
Dec  6 04:40:49 np0005548918 intelligent_bouman[76573]: --> Creating keyring file for osd.2
Dec  6 04:40:49 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Dec  6 04:40:49 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Dec  6 04:40:49 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid b46cc65b-25ba-490a-8b8e-91e4407f3aed --setuser ceph --setgroup ceph
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: Saving service node-exporter spec with placement *
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: Saving service grafana spec with placement compute-0;count:1
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: Saving service prometheus spec with placement compute-0;count:1
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: Saving service alertmanager spec with placement compute-0;count:1
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "b46cc65b-25ba-490a-8b8e-91e4407f3aed"}]: dispatch
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.102:0/569971095' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "b46cc65b-25ba-490a-8b8e-91e4407f3aed"}]: dispatch
Dec  6 04:40:49 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "b46cc65b-25ba-490a-8b8e-91e4407f3aed"}]': finished
Dec  6 04:40:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:40:51 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/4267326554' entity='client.admin' 
Dec  6 04:40:52 np0005548918 intelligent_bouman[76573]: stderr: 2025-12-06T09:40:49.765+0000 7efd43dc2740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) No valid bdev label found
Dec  6 04:40:52 np0005548918 intelligent_bouman[76573]: stderr: 2025-12-06T09:40:50.027+0000 7efd43dc2740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Dec  6 04:40:52 np0005548918 intelligent_bouman[76573]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Dec  6 04:40:52 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:52 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:52 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/821839877' entity='client.admin' 
Dec  6 04:40:52 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  6 04:40:52 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec  6 04:40:53 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec  6 04:40:53 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec  6 04:40:53 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  6 04:40:53 np0005548918 intelligent_bouman[76573]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  6 04:40:53 np0005548918 intelligent_bouman[76573]: --> ceph-volume lvm activate successful for osd ID: 2
Dec  6 04:40:53 np0005548918 intelligent_bouman[76573]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Dec  6 04:40:53 np0005548918 systemd[1]: libpod-48189a8bb5624857694e1303043d39f173edceadeec90990079589abf72048cc.scope: Deactivated successfully.
Dec  6 04:40:53 np0005548918 systemd[1]: libpod-48189a8bb5624857694e1303043d39f173edceadeec90990079589abf72048cc.scope: Consumed 2.152s CPU time.
Dec  6 04:40:53 np0005548918 podman[77549]: 2025-12-06 09:40:53.255329396 +0000 UTC m=+0.028554190 container died 48189a8bb5624857694e1303043d39f173edceadeec90990079589abf72048cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_bouman, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec  6 04:40:53 np0005548918 systemd[1]: var-lib-containers-storage-overlay-dbd71b9af36a9726938959c5c1cccbe42a6d750904615a06f032498677c64682-merged.mount: Deactivated successfully.
Dec  6 04:40:53 np0005548918 podman[77549]: 2025-12-06 09:40:53.298620203 +0000 UTC m=+0.071844987 container remove 48189a8bb5624857694e1303043d39f173edceadeec90990079589abf72048cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_bouman, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec  6 04:40:53 np0005548918 systemd[1]: libpod-conmon-48189a8bb5624857694e1303043d39f173edceadeec90990079589abf72048cc.scope: Deactivated successfully.
Dec  6 04:40:53 np0005548918 podman[77655]: 2025-12-06 09:40:53.923903581 +0000 UTC m=+0.050376308 container create c6895bec3692698f4896875844996082bd6ff250e5417dadfad7cfe2ed9e1c3a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_noether, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:40:53 np0005548918 systemd[1]: Started libpod-conmon-c6895bec3692698f4896875844996082bd6ff250e5417dadfad7cfe2ed9e1c3a.scope.
Dec  6 04:40:54 np0005548918 podman[77655]: 2025-12-06 09:40:53.902486814 +0000 UTC m=+0.028959501 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:54 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:54 np0005548918 podman[77655]: 2025-12-06 09:40:54.017626356 +0000 UTC m=+0.144099123 container init c6895bec3692698f4896875844996082bd6ff250e5417dadfad7cfe2ed9e1c3a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_noether, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Dec  6 04:40:54 np0005548918 podman[77655]: 2025-12-06 09:40:54.026122555 +0000 UTC m=+0.152595232 container start c6895bec3692698f4896875844996082bd6ff250e5417dadfad7cfe2ed9e1c3a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_noether, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:40:54 np0005548918 podman[77655]: 2025-12-06 09:40:54.030164014 +0000 UTC m=+0.156636791 container attach c6895bec3692698f4896875844996082bd6ff250e5417dadfad7cfe2ed9e1c3a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_noether, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:40:54 np0005548918 goofy_noether[77672]: 167 167
Dec  6 04:40:54 np0005548918 systemd[1]: libpod-c6895bec3692698f4896875844996082bd6ff250e5417dadfad7cfe2ed9e1c3a.scope: Deactivated successfully.
Dec  6 04:40:54 np0005548918 podman[77655]: 2025-12-06 09:40:54.035744985 +0000 UTC m=+0.162217702 container died c6895bec3692698f4896875844996082bd6ff250e5417dadfad7cfe2ed9e1c3a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_noether, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:40:54 np0005548918 systemd[1]: var-lib-containers-storage-overlay-759a6718034cc98b1caa2323ece8a040a6144911a882c0747d9d5572cd86f16a-merged.mount: Deactivated successfully.
Dec  6 04:40:54 np0005548918 podman[77655]: 2025-12-06 09:40:54.083338477 +0000 UTC m=+0.209811194 container remove c6895bec3692698f4896875844996082bd6ff250e5417dadfad7cfe2ed9e1c3a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_noether, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Dec  6 04:40:54 np0005548918 systemd[1]: libpod-conmon-c6895bec3692698f4896875844996082bd6ff250e5417dadfad7cfe2ed9e1c3a.scope: Deactivated successfully.
Dec  6 04:40:54 np0005548918 podman[77697]: 2025-12-06 09:40:54.253282896 +0000 UTC m=+0.034952903 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:54 np0005548918 podman[77697]: 2025-12-06 09:40:54.385092858 +0000 UTC m=+0.166762875 container create fcacdc57db070bcc5162fab939f7afc88d658cc48545b66712000124423a45ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_agnesi, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Dec  6 04:40:54 np0005548918 systemd[1]: Started libpod-conmon-fcacdc57db070bcc5162fab939f7afc88d658cc48545b66712000124423a45ab.scope.
Dec  6 04:40:54 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:54 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/862f6af2dd9c03a48e7eab5627e0c475c47c2246ad33c9b919901b24f2c9bb82/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:54 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/862f6af2dd9c03a48e7eab5627e0c475c47c2246ad33c9b919901b24f2c9bb82/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:54 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/862f6af2dd9c03a48e7eab5627e0c475c47c2246ad33c9b919901b24f2c9bb82/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:54 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/862f6af2dd9c03a48e7eab5627e0c475c47c2246ad33c9b919901b24f2c9bb82/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:54 np0005548918 podman[77697]: 2025-12-06 09:40:54.489622095 +0000 UTC m=+0.271292162 container init fcacdc57db070bcc5162fab939f7afc88d658cc48545b66712000124423a45ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_agnesi, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:40:54 np0005548918 podman[77697]: 2025-12-06 09:40:54.500874158 +0000 UTC m=+0.282544155 container start fcacdc57db070bcc5162fab939f7afc88d658cc48545b66712000124423a45ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_agnesi, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  6 04:40:54 np0005548918 podman[77697]: 2025-12-06 09:40:54.505108651 +0000 UTC m=+0.286778678 container attach fcacdc57db070bcc5162fab939f7afc88d658cc48545b66712000124423a45ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_agnesi, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]: {
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:    "2": [
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:        {
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:            "devices": [
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "/dev/loop3"
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:            ],
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:            "lv_name": "ceph_lv0",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:            "lv_size": "21470642176",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=WVKHvl-noNQ-QnRO-RsQD-VZwj-ky5X-QmffVX,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5ecd3f74-dade-5fc4-92ce-8950ae424258,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b46cc65b-25ba-490a-8b8e-91e4407f3aed,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:            "lv_uuid": "WVKHvl-noNQ-QnRO-RsQD-VZwj-ky5X-QmffVX",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:            "name": "ceph_lv0",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:            "tags": {
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "ceph.block_uuid": "WVKHvl-noNQ-QnRO-RsQD-VZwj-ky5X-QmffVX",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "ceph.cephx_lockbox_secret": "",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "ceph.cluster_fsid": "5ecd3f74-dade-5fc4-92ce-8950ae424258",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "ceph.cluster_name": "ceph",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "ceph.crush_device_class": "",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "ceph.encrypted": "0",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "ceph.osd_fsid": "b46cc65b-25ba-490a-8b8e-91e4407f3aed",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "ceph.osd_id": "2",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "ceph.type": "block",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "ceph.vdo": "0",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:                "ceph.with_tpm": "0"
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:            },
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:            "type": "block",
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:            "vg_name": "ceph_vg0"
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:        }
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]:    ]
Dec  6 04:40:54 np0005548918 modest_agnesi[77714]: }
Dec  6 04:40:54 np0005548918 systemd[1]: libpod-fcacdc57db070bcc5162fab939f7afc88d658cc48545b66712000124423a45ab.scope: Deactivated successfully.
Dec  6 04:40:54 np0005548918 podman[77697]: 2025-12-06 09:40:54.791228141 +0000 UTC m=+0.572898148 container died fcacdc57db070bcc5162fab939f7afc88d658cc48545b66712000124423a45ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_agnesi, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Dec  6 04:40:54 np0005548918 systemd[1]: var-lib-containers-storage-overlay-862f6af2dd9c03a48e7eab5627e0c475c47c2246ad33c9b919901b24f2c9bb82-merged.mount: Deactivated successfully.
Dec  6 04:40:54 np0005548918 podman[77697]: 2025-12-06 09:40:54.832785911 +0000 UTC m=+0.614455908 container remove fcacdc57db070bcc5162fab939f7afc88d658cc48545b66712000124423a45ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_agnesi, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:40:54 np0005548918 systemd[1]: libpod-conmon-fcacdc57db070bcc5162fab939f7afc88d658cc48545b66712000124423a45ab.scope: Deactivated successfully.
Dec  6 04:40:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:40:55 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/1482144347' entity='client.admin' 
Dec  6 04:40:55 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Dec  6 04:40:55 np0005548918 podman[77826]: 2025-12-06 09:40:55.441785051 +0000 UTC m=+0.058265651 container create d96963d643c1726e407bf40725ea6b4bce7464ec314836d188530cfdd5da8e87 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_khorana, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:40:55 np0005548918 systemd[1]: Started libpod-conmon-d96963d643c1726e407bf40725ea6b4bce7464ec314836d188530cfdd5da8e87.scope.
Dec  6 04:40:55 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:55 np0005548918 podman[77826]: 2025-12-06 09:40:55.40614961 +0000 UTC m=+0.022630230 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:55 np0005548918 podman[77826]: 2025-12-06 09:40:55.505820805 +0000 UTC m=+0.122301475 container init d96963d643c1726e407bf40725ea6b4bce7464ec314836d188530cfdd5da8e87 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_khorana, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:40:55 np0005548918 podman[77826]: 2025-12-06 09:40:55.513347369 +0000 UTC m=+0.129827999 container start d96963d643c1726e407bf40725ea6b4bce7464ec314836d188530cfdd5da8e87 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_khorana, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:40:55 np0005548918 eager_khorana[77842]: 167 167
Dec  6 04:40:55 np0005548918 podman[77826]: 2025-12-06 09:40:55.517009247 +0000 UTC m=+0.133489937 container attach d96963d643c1726e407bf40725ea6b4bce7464ec314836d188530cfdd5da8e87 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_khorana, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Dec  6 04:40:55 np0005548918 systemd[1]: libpod-d96963d643c1726e407bf40725ea6b4bce7464ec314836d188530cfdd5da8e87.scope: Deactivated successfully.
Dec  6 04:40:55 np0005548918 podman[77826]: 2025-12-06 09:40:55.519075843 +0000 UTC m=+0.135556433 container died d96963d643c1726e407bf40725ea6b4bce7464ec314836d188530cfdd5da8e87 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_khorana, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:40:55 np0005548918 systemd[1]: var-lib-containers-storage-overlay-f83b0a7b8e6652b6cf62421150c681230e64d3b2a57ea0ffc970e15def6704a9-merged.mount: Deactivated successfully.
Dec  6 04:40:55 np0005548918 podman[77826]: 2025-12-06 09:40:55.547773456 +0000 UTC m=+0.164254056 container remove d96963d643c1726e407bf40725ea6b4bce7464ec314836d188530cfdd5da8e87 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_khorana, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:40:55 np0005548918 systemd[1]: libpod-conmon-d96963d643c1726e407bf40725ea6b4bce7464ec314836d188530cfdd5da8e87.scope: Deactivated successfully.
Dec  6 04:40:55 np0005548918 podman[77872]: 2025-12-06 09:40:55.928832214 +0000 UTC m=+0.055411953 container create af498d3e35244dd7fea27869b3ac278dec3185c96f7bb1d219e2a37b82943f82 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate-test, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:40:55 np0005548918 systemd[1]: Started libpod-conmon-af498d3e35244dd7fea27869b3ac278dec3185c96f7bb1d219e2a37b82943f82.scope.
Dec  6 04:40:55 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:55 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86af1240680e8a134b9c3ae5c871c2c89bc7f46423e5a2a75cf983d86c3dcfef/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:55 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86af1240680e8a134b9c3ae5c871c2c89bc7f46423e5a2a75cf983d86c3dcfef/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:55 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86af1240680e8a134b9c3ae5c871c2c89bc7f46423e5a2a75cf983d86c3dcfef/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:55 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86af1240680e8a134b9c3ae5c871c2c89bc7f46423e5a2a75cf983d86c3dcfef/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:55 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86af1240680e8a134b9c3ae5c871c2c89bc7f46423e5a2a75cf983d86c3dcfef/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:56 np0005548918 podman[77872]: 2025-12-06 09:40:55.913811019 +0000 UTC m=+0.040390778 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:56 np0005548918 podman[77872]: 2025-12-06 09:40:56.012666173 +0000 UTC m=+0.139245992 container init af498d3e35244dd7fea27869b3ac278dec3185c96f7bb1d219e2a37b82943f82 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec  6 04:40:56 np0005548918 podman[77872]: 2025-12-06 09:40:56.023023512 +0000 UTC m=+0.149603251 container start af498d3e35244dd7fea27869b3ac278dec3185c96f7bb1d219e2a37b82943f82 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate-test, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Dec  6 04:40:56 np0005548918 podman[77872]: 2025-12-06 09:40:56.026075354 +0000 UTC m=+0.152655123 container attach af498d3e35244dd7fea27869b3ac278dec3185c96f7bb1d219e2a37b82943f82 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate-test, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:40:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate-test[77889]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Dec  6 04:40:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate-test[77889]:                            [--no-systemd] [--no-tmpfs]
Dec  6 04:40:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate-test[77889]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec  6 04:40:56 np0005548918 systemd[1]: libpod-af498d3e35244dd7fea27869b3ac278dec3185c96f7bb1d219e2a37b82943f82.scope: Deactivated successfully.
Dec  6 04:40:56 np0005548918 podman[77872]: 2025-12-06 09:40:56.204277146 +0000 UTC m=+0.330856915 container died af498d3e35244dd7fea27869b3ac278dec3185c96f7bb1d219e2a37b82943f82 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate-test, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  6 04:40:56 np0005548918 systemd[1]: var-lib-containers-storage-overlay-86af1240680e8a134b9c3ae5c871c2c89bc7f46423e5a2a75cf983d86c3dcfef-merged.mount: Deactivated successfully.
Dec  6 04:40:56 np0005548918 podman[77872]: 2025-12-06 09:40:56.246151214 +0000 UTC m=+0.372730943 container remove af498d3e35244dd7fea27869b3ac278dec3185c96f7bb1d219e2a37b82943f82 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate-test, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Dec  6 04:40:56 np0005548918 systemd[1]: libpod-conmon-af498d3e35244dd7fea27869b3ac278dec3185c96f7bb1d219e2a37b82943f82.scope: Deactivated successfully.
Dec  6 04:40:56 np0005548918 ceph-mon[75798]: Deploying daemon osd.2 on compute-2
Dec  6 04:40:56 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/3512142115' entity='client.admin' 
Dec  6 04:40:56 np0005548918 systemd[1]: Reloading.
Dec  6 04:40:56 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:56 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:56 np0005548918 systemd[1]: Reloading.
Dec  6 04:40:56 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:40:56 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:40:57 np0005548918 systemd[1]: Starting Ceph osd.2 for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:40:57 np0005548918 podman[78048]: 2025-12-06 09:40:57.375955787 +0000 UTC m=+0.049177655 container create 586e4ec41cdd48cbc7b335dca53a1c9ee0f0777339997d1e9b3859a3c4eb0eea (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Dec  6 04:40:57 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:57 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7989692c14b2f5e00d4b7a58de145fbcc1d623836df3e2f1708bbe11b065385/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:57 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7989692c14b2f5e00d4b7a58de145fbcc1d623836df3e2f1708bbe11b065385/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:57 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7989692c14b2f5e00d4b7a58de145fbcc1d623836df3e2f1708bbe11b065385/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:57 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7989692c14b2f5e00d4b7a58de145fbcc1d623836df3e2f1708bbe11b065385/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:57 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7989692c14b2f5e00d4b7a58de145fbcc1d623836df3e2f1708bbe11b065385/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:57 np0005548918 podman[78048]: 2025-12-06 09:40:57.355045964 +0000 UTC m=+0.028267832 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:57 np0005548918 podman[78048]: 2025-12-06 09:40:57.453891597 +0000 UTC m=+0.127113515 container init 586e4ec41cdd48cbc7b335dca53a1c9ee0f0777339997d1e9b3859a3c4eb0eea (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True)
Dec  6 04:40:57 np0005548918 podman[78048]: 2025-12-06 09:40:57.464995206 +0000 UTC m=+0.138217034 container start 586e4ec41cdd48cbc7b335dca53a1c9ee0f0777339997d1e9b3859a3c4eb0eea (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec  6 04:40:57 np0005548918 podman[78048]: 2025-12-06 09:40:57.468592023 +0000 UTC m=+0.141813941 container attach 586e4ec41cdd48cbc7b335dca53a1c9ee0f0777339997d1e9b3859a3c4eb0eea (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Dec  6 04:40:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate[78064]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:40:57 np0005548918 bash[78048]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:40:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate[78064]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:40:57 np0005548918 bash[78048]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:40:58 np0005548918 lvm[78171]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:40:58 np0005548918 lvm[78171]: VG ceph_vg0 finished
Dec  6 04:40:58 np0005548918 python3[78164]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:40:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate[78064]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec  6 04:40:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate[78064]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:40:58 np0005548918 bash[78048]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec  6 04:40:58 np0005548918 bash[78048]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:40:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate[78064]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:40:58 np0005548918 bash[78048]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  6 04:40:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate[78064]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  6 04:40:58 np0005548918 bash[78048]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  6 04:40:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate[78064]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec  6 04:40:58 np0005548918 bash[78048]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec  6 04:40:58 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/2451230512' entity='client.admin' 
Dec  6 04:40:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate[78064]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec  6 04:40:58 np0005548918 bash[78048]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec  6 04:40:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate[78064]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec  6 04:40:58 np0005548918 bash[78048]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec  6 04:40:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate[78064]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  6 04:40:58 np0005548918 bash[78048]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  6 04:40:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate[78064]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  6 04:40:58 np0005548918 bash[78048]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  6 04:40:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate[78064]: --> ceph-volume lvm activate successful for osd ID: 2
Dec  6 04:40:58 np0005548918 bash[78048]: --> ceph-volume lvm activate successful for osd ID: 2
Dec  6 04:40:58 np0005548918 systemd[1]: libpod-586e4ec41cdd48cbc7b335dca53a1c9ee0f0777339997d1e9b3859a3c4eb0eea.scope: Deactivated successfully.
Dec  6 04:40:58 np0005548918 podman[78048]: 2025-12-06 09:40:58.743132877 +0000 UTC m=+1.416354745 container died 586e4ec41cdd48cbc7b335dca53a1c9ee0f0777339997d1e9b3859a3c4eb0eea (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:40:58 np0005548918 systemd[1]: libpod-586e4ec41cdd48cbc7b335dca53a1c9ee0f0777339997d1e9b3859a3c4eb0eea.scope: Consumed 1.346s CPU time.
Dec  6 04:40:58 np0005548918 systemd[1]: var-lib-containers-storage-overlay-f7989692c14b2f5e00d4b7a58de145fbcc1d623836df3e2f1708bbe11b065385-merged.mount: Deactivated successfully.
Dec  6 04:40:58 np0005548918 podman[78048]: 2025-12-06 09:40:58.863551711 +0000 UTC m=+1.536773569 container remove 586e4ec41cdd48cbc7b335dca53a1c9ee0f0777339997d1e9b3859a3c4eb0eea (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2-activate, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:40:59 np0005548918 podman[78357]: 2025-12-06 09:40:59.12481844 +0000 UTC m=+0.066653186 container create 446ec9caaae7f639d28b1738673667de2d277b2098f08fc0d6ccaa63e4fb29ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Dec  6 04:40:59 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b3df05b4cf7af16f875236002e5f7f89c397c9a89199c4d4ec8059fcb73517c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:59 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b3df05b4cf7af16f875236002e5f7f89c397c9a89199c4d4ec8059fcb73517c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:59 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b3df05b4cf7af16f875236002e5f7f89c397c9a89199c4d4ec8059fcb73517c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:59 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b3df05b4cf7af16f875236002e5f7f89c397c9a89199c4d4ec8059fcb73517c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:59 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b3df05b4cf7af16f875236002e5f7f89c397c9a89199c4d4ec8059fcb73517c/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec  6 04:40:59 np0005548918 podman[78357]: 2025-12-06 09:40:59.097133134 +0000 UTC m=+0.038967920 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:59 np0005548918 podman[78357]: 2025-12-06 09:40:59.196080021 +0000 UTC m=+0.137914817 container init 446ec9caaae7f639d28b1738673667de2d277b2098f08fc0d6ccaa63e4fb29ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:40:59 np0005548918 podman[78357]: 2025-12-06 09:40:59.201173968 +0000 UTC m=+0.143008704 container start 446ec9caaae7f639d28b1738673667de2d277b2098f08fc0d6ccaa63e4fb29ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Dec  6 04:40:59 np0005548918 bash[78357]: 446ec9caaae7f639d28b1738673667de2d277b2098f08fc0d6ccaa63e4fb29ac
Dec  6 04:40:59 np0005548918 systemd[1]: Started Ceph osd.2 for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: set uid:gid to 167:167 (ceph:ceph)
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: pidfile_write: ignore empty --pid-file
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  6 04:40:59 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:59 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/2111286861' entity='client.admin' 
Dec  6 04:40:59 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:40:59 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  6 04:40:59 np0005548918 podman[78478]: 2025-12-06 09:40:59.892399193 +0000 UTC m=+0.038816827 container create 398bdc19c13e56ede90c39f721ec9e2bc75f36243a8f4ada0fb5280c826be8a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_galois, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:40:59 np0005548918 systemd[1]: Started libpod-conmon-398bdc19c13e56ede90c39f721ec9e2bc75f36243a8f4ada0fb5280c826be8a6.scope.
Dec  6 04:40:59 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:40:59 np0005548918 podman[78478]: 2025-12-06 09:40:59.876607608 +0000 UTC m=+0.023025262 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:40:59 np0005548918 podman[78478]: 2025-12-06 09:40:59.99395922 +0000 UTC m=+0.140376864 container init 398bdc19c13e56ede90c39f721ec9e2bc75f36243a8f4ada0fb5280c826be8a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_galois, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:41:00 np0005548918 podman[78478]: 2025-12-06 09:41:00.000089895 +0000 UTC m=+0.146507549 container start 398bdc19c13e56ede90c39f721ec9e2bc75f36243a8f4ada0fb5280c826be8a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_galois, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Dec  6 04:41:00 np0005548918 podman[78478]: 2025-12-06 09:41:00.003682572 +0000 UTC m=+0.150100196 container attach 398bdc19c13e56ede90c39f721ec9e2bc75f36243a8f4ada0fb5280c826be8a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_galois, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True)
Dec  6 04:41:00 np0005548918 festive_galois[78496]: 167 167
Dec  6 04:41:00 np0005548918 systemd[1]: libpod-398bdc19c13e56ede90c39f721ec9e2bc75f36243a8f4ada0fb5280c826be8a6.scope: Deactivated successfully.
Dec  6 04:41:00 np0005548918 conmon[78496]: conmon 398bdc19c13e56ede90c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-398bdc19c13e56ede90c39f721ec9e2bc75f36243a8f4ada0fb5280c826be8a6.scope/container/memory.events
Dec  6 04:41:00 np0005548918 podman[78478]: 2025-12-06 09:41:00.006764175 +0000 UTC m=+0.153181789 container died 398bdc19c13e56ede90c39f721ec9e2bc75f36243a8f4ada0fb5280c826be8a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_galois, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:41:00 np0005548918 systemd[1]: var-lib-containers-storage-overlay-765b02de0fd931d2d9e7d6de27543a2263181518449bfb927cbba3fbb0b33b2f-merged.mount: Deactivated successfully.
Dec  6 04:41:00 np0005548918 podman[78478]: 2025-12-06 09:41:00.046388762 +0000 UTC m=+0.192806406 container remove 398bdc19c13e56ede90c39f721ec9e2bc75f36243a8f4ada0fb5280c826be8a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_galois, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Dec  6 04:41:00 np0005548918 systemd[1]: libpod-conmon-398bdc19c13e56ede90c39f721ec9e2bc75f36243a8f4ada0fb5280c826be8a6.scope: Deactivated successfully.
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  6 04:41:00 np0005548918 podman[78521]: 2025-12-06 09:41:00.19424606 +0000 UTC m=+0.037592772 container create b26a90ec6debb696a0f70f32e444ec900f3d42c1293c752554b9b580d1364003 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_vaughan, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Dec  6 04:41:00 np0005548918 podman[78521]: 2025-12-06 09:41:00.1758274 +0000 UTC m=+0.019174122 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:41:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5800 /var/lib/ceph/osd/ceph-2/block) close
Dec  6 04:41:00 np0005548918 systemd[1]: Started libpod-conmon-b26a90ec6debb696a0f70f32e444ec900f3d42c1293c752554b9b580d1364003.scope.
Dec  6 04:41:00 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:41:00 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbda76d03270cdeb65a40e640726d1fa8ea688afa5b492f4625b758db004bfc1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:00 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbda76d03270cdeb65a40e640726d1fa8ea688afa5b492f4625b758db004bfc1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:00 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbda76d03270cdeb65a40e640726d1fa8ea688afa5b492f4625b758db004bfc1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:00 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbda76d03270cdeb65a40e640726d1fa8ea688afa5b492f4625b758db004bfc1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:00 np0005548918 podman[78521]: 2025-12-06 09:41:00.519758929 +0000 UTC m=+0.363105651 container init b26a90ec6debb696a0f70f32e444ec900f3d42c1293c752554b9b580d1364003 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_vaughan, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Dec  6 04:41:00 np0005548918 podman[78521]: 2025-12-06 09:41:00.537537653 +0000 UTC m=+0.380884385 container start b26a90ec6debb696a0f70f32e444ec900f3d42c1293c752554b9b580d1364003 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_vaughan, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:41:00 np0005548918 podman[78521]: 2025-12-06 09:41:00.542851738 +0000 UTC m=+0.386198480 container attach b26a90ec6debb696a0f70f32e444ec900f3d42c1293c752554b9b580d1364003 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_vaughan, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2)
Dec  6 04:41:00 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/2854219236' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c47d5c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: load: jerasure load: lrc 
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  6 04:41:00 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) close
Dec  6 04:41:01 np0005548918 lvm[78620]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:41:01 np0005548918 lvm[78620]: VG ceph_vg0 finished
Dec  6 04:41:01 np0005548918 kind_vaughan[78540]: {}
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) close
Dec  6 04:41:01 np0005548918 systemd[1]: libpod-b26a90ec6debb696a0f70f32e444ec900f3d42c1293c752554b9b580d1364003.scope: Deactivated successfully.
Dec  6 04:41:01 np0005548918 systemd[1]: libpod-b26a90ec6debb696a0f70f32e444ec900f3d42c1293c752554b9b580d1364003.scope: Consumed 1.049s CPU time.
Dec  6 04:41:01 np0005548918 podman[78629]: 2025-12-06 09:41:01.285397219 +0000 UTC m=+0.022304207 container died b26a90ec6debb696a0f70f32e444ec900f3d42c1293c752554b9b580d1364003 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_vaughan, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:41:01 np0005548918 systemd[1]: var-lib-containers-storage-overlay-cbda76d03270cdeb65a40e640726d1fa8ea688afa5b492f4625b758db004bfc1-merged.mount: Deactivated successfully.
Dec  6 04:41:01 np0005548918 podman[78629]: 2025-12-06 09:41:01.319334661 +0000 UTC m=+0.056241659 container remove b26a90ec6debb696a0f70f32e444ec900f3d42c1293c752554b9b580d1364003 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_vaughan, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  6 04:41:01 np0005548918 systemd[1]: libpod-conmon-b26a90ec6debb696a0f70f32e444ec900f3d42c1293c752554b9b580d1364003.scope: Deactivated successfully.
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) close
Dec  6 04:41:01 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/2854219236' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Dec  6 04:41:01 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:01 np0005548918 ceph-mon[75798]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:41:01 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) close
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) close
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564ac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564b000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564b000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564b000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564b000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount shared_bdev_used = 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: RocksDB version: 7.9.2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Git sha 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Compile date 2025-07-17 03:12:14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: DB SUMMARY
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: DB Session ID:  LX9SQ1K5F8KPNVFVH67V
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: CURRENT file:  CURRENT
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: IDENTITY file:  IDENTITY
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                         Options.error_if_exists: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.create_if_missing: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                         Options.paranoid_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                                     Options.env: 0x55f8c4829650
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                                Options.info_log: 0x55f8c564f520
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_file_opening_threads: 16
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                              Options.statistics: (nil)
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.use_fsync: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.max_log_file_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                         Options.allow_fallocate: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.use_direct_reads: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.create_missing_column_families: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                              Options.db_log_dir: 
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                                 Options.wal_dir: db.wal
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.advise_random_on_open: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.write_buffer_manager: 0x55f8c5740a00
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                            Options.rate_limiter: (nil)
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.unordered_write: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.row_cache: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                              Options.wal_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.allow_ingest_behind: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.two_write_queues: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.manual_wal_flush: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.wal_compression: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.atomic_flush: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.log_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.allow_data_in_errors: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.db_host_id: __hostname__
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.max_background_jobs: 4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.max_background_compactions: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.max_subcompactions: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.max_open_files: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.bytes_per_sync: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.max_background_flushes: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Compression algorithms supported:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: #011kZSTD supported: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: #011kXpressCompression supported: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: #011kBZip2Compression supported: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: #011kLZ4Compression supported: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: #011kZlibCompression supported: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: #011kLZ4HCCompression supported: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: #011kSnappyCompression supported: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c564f8e0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f8c486b350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c564f8e0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f8c486b350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c564f8e0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f8c486b350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c564f8e0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f8c486b350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c564f8e0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f8c486b350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c564f8e0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f8c486b350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c564f8e0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f8c486b350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c564f900)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f8c486a9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c564f900)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f8c486a9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c564f900)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f8c486a9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: fb5b71de-3814-42b9-888c-ff0f06748421
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014062383697, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014062384026, "job": 1, "event": "recovery_finished"}
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: freelist init
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: freelist _read_cfg
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs umount
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564b000 /var/lib/ceph/osd/ceph-2/block) close
Dec  6 04:41:02 np0005548918 podman[78806]: 2025-12-06 09:41:02.394483344 +0000 UTC m=+0.106821795 container exec 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:41:02 np0005548918 podman[78806]: 2025-12-06 09:41:02.500168536 +0000 UTC m=+0.212506987 container exec_died 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564b000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564b000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564b000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bdev(0x55f8c564b000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluefs mount shared_bdev_used = 4718592
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: RocksDB version: 7.9.2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Git sha 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Compile date 2025-07-17 03:12:14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: DB SUMMARY
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: DB Session ID:  LX9SQ1K5F8KPNVFVH67U
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: CURRENT file:  CURRENT
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: IDENTITY file:  IDENTITY
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                         Options.error_if_exists: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.create_if_missing: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                         Options.paranoid_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                                     Options.env: 0x55f8c4828770
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                                Options.info_log: 0x55f8c59703c0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_file_opening_threads: 16
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                              Options.statistics: (nil)
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.use_fsync: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.max_log_file_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                         Options.allow_fallocate: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.use_direct_reads: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.create_missing_column_families: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                              Options.db_log_dir: 
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                                 Options.wal_dir: db.wal
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.advise_random_on_open: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.write_buffer_manager: 0x55f8c5740aa0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                            Options.rate_limiter: (nil)
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.unordered_write: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.row_cache: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                              Options.wal_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.allow_ingest_behind: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.two_write_queues: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.manual_wal_flush: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.wal_compression: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.atomic_flush: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.log_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.allow_data_in_errors: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.db_host_id: __hostname__
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.max_background_jobs: 4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.max_background_compactions: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.max_subcompactions: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.max_open_files: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.bytes_per_sync: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.max_background_flushes: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Compression algorithms supported:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: 	kZSTD supported: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: 	kXpressCompression supported: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: 	kBZip2Compression supported: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: 	kLZ4Compression supported: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: 	kZlibCompression supported: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: 	kLZ4HCCompression supported: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: 	kSnappyCompression supported: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c59aaa20)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f8c486a9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c59aaa20)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f8c486a9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c59aaa20)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f8c486a9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c59aaa20)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f8c486a9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c59aaa20)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f8c486a9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c59aaa20)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f8c486a9b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c59aaa20)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f8c486a9b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c59aab80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f8c486af30#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c59aab80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f8c486af30
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:           Options.merge_operator: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.compaction_filter_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.sst_partitioner_factory: None
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f8c59aab80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f8c486af30
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.write_buffer_size: 16777216
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.max_write_buffer_number: 64
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.compression: LZ4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.num_levels: 7
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.level: 32767
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.compression_opts.strategy: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                  Options.compression_opts.enabled: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.arena_block_size: 1048576
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.disable_auto_compactions: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.inplace_update_support: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.bloom_locality: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                    Options.max_successive_merges: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.paranoid_file_checks: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.force_consistency_checks: 1
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.report_bg_io_stats: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                               Options.ttl: 2592000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                       Options.enable_blob_files: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                           Options.min_blob_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                          Options.blob_file_size: 268435456
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb:                Options.blob_file_starting_level: 0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: fb5b71de-3814-42b9-888c-ff0f06748421
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014062665885, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014062708707, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014062, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fb5b71de-3814-42b9-888c-ff0f06748421", "db_session_id": "LX9SQ1K5F8KPNVFVH67U", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014062713124, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014062, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fb5b71de-3814-42b9-888c-ff0f06748421", "db_session_id": "LX9SQ1K5F8KPNVFVH67U", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014062717708, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014062, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fb5b71de-3814-42b9-888c-ff0f06748421", "db_session_id": "LX9SQ1K5F8KPNVFVH67U", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014062719507, "job": 1, "event": "recovery_finished"}
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec  6 04:41:02 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/2146703949' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55f8c59afc00
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: DB pointer 0x55f8c598e000
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f8c486a9b0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f8c486a9b0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 
collections: 1 last_copies: 8 last_secs: 4.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 
seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: _get_class not permitted to load lua
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn  1: '-n'
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn  2: 'mgr.compute-2.oazbvn'
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn  3: '-f'
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn  4: '--setuser'
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn  5: 'ceph'
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn  6: '--setgroup'
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn  7: 'ceph'
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn  8: '--default-log-to-file=false'
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn  9: '--default-log-to-journald=true'
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn  10: '--default-log-to-stderr=false'
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr respawn  exe_path /proc/self/exe
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: _get_class not permitted to load sdk
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: osd.2 0 load_pgs
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: osd.2 0 load_pgs opened 0 pgs
Dec  6 04:41:02 np0005548918 ceph-osd[78376]: osd.2 0 log_to_monitors true
Dec  6 04:41:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2[78372]: 2025-12-06T09:41:02.753+0000 7f20e0d4a740 -1 osd.2 0 log_to_monitors true
Dec  6 04:41:02 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0)
Dec  6 04:41:02 np0005548918 ceph-mon[75798]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/709563040,v1:192.168.122.102:6801/709563040]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec  6 04:41:02 np0005548918 systemd[1]: session-25.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Session 25 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548918 systemd[1]: session-22.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Session 32 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548918 systemd[1]: session-20.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Session 22 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548918 systemd[1]: session-28.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548918 systemd[1]: session-26.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Session 20 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Session 26 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Session 28 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Removed session 25.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Removed session 22.
Dec  6 04:41:02 np0005548918 systemd[1]: session-31.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548918 systemd[1]: session-30.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548918 systemd[1]: session-23.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Session 31 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548918 systemd[1]: session-27.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Session 30 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548918 systemd[1]: session-29.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Session 23 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: ignoring --setuser ceph since I am not root
Dec  6 04:41:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: ignoring --setgroup ceph since I am not root
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Session 27 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Session 29 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Removed session 20.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Removed session 28.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Removed session 26.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Removed session 31.
Dec  6 04:41:02 np0005548918 systemd[1]: session-24.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: pidfile_write: ignore empty --pid-file
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Session 24 logged out. Waiting for processes to exit.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Removed session 30.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Removed session 23.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Removed session 27.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Removed session 29.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Removed session 24.
Dec  6 04:41:02 np0005548918 systemd[1]: session-32.scope: Deactivated successfully.
Dec  6 04:41:02 np0005548918 systemd[1]: session-32.scope: Consumed 55.548s CPU time.
Dec  6 04:41:02 np0005548918 systemd-logind[800]: Removed session 32.
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'alerts'
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:41:02 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'balancer'
Dec  6 04:41:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:02.994+0000 7fbbe11bf140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:41:03 np0005548918 ceph-mgr[76108]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:41:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:03.074+0000 7fbbe11bf140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:41:03 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'cephadm'
Dec  6 04:41:03 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec  6 04:41:03 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec  6 04:41:03 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/2146703949' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Dec  6 04:41:03 np0005548918 ceph-mon[75798]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec  6 04:41:03 np0005548918 ceph-mon[75798]: from='osd.2 [v2:192.168.122.102:6800/709563040,v1:192.168.122.102:6801/709563040]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec  6 04:41:03 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e32 e32: 3 total, 2 up, 3 in
Dec  6 04:41:03 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]} v 0)
Dec  6 04:41:03 np0005548918 ceph-mon[75798]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/709563040,v1:192.168.122.102:6801/709563040]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec  6 04:41:03 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'crash'
Dec  6 04:41:03 np0005548918 ceph-mgr[76108]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:41:03 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'dashboard'
Dec  6 04:41:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:03.873+0000 7fbbe11bf140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'devicehealth'
Dec  6 04:41:04 np0005548918 ceph-mgr[76108]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'diskprediction_local'
Dec  6 04:41:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:04.498+0000 7fbbe11bf140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  6 04:41:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  6 04:41:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]:  from numpy import show_config as show_numpy_config
Dec  6 04:41:04 np0005548918 ceph-mgr[76108]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'influx'
Dec  6 04:41:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:04.663+0000 7fbbe11bf140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548918 ceph-mgr[76108]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'insights'
Dec  6 04:41:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:04.737+0000 7fbbe11bf140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e33 e33: 3 total, 2 up, 3 in
Dec  6 04:41:04 np0005548918 ceph-osd[78376]: osd.2 0 done with init, starting boot process
Dec  6 04:41:04 np0005548918 ceph-osd[78376]: osd.2 0 start_boot
Dec  6 04:41:04 np0005548918 ceph-osd[78376]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec  6 04:41:04 np0005548918 ceph-osd[78376]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec  6 04:41:04 np0005548918 ceph-osd[78376]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec  6 04:41:04 np0005548918 ceph-osd[78376]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec  6 04:41:04 np0005548918 ceph-osd[78376]: osd.2 0  bench count 12288000 bsize 4 KiB
Dec  6 04:41:04 np0005548918 ceph-mon[75798]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Dec  6 04:41:04 np0005548918 ceph-mon[75798]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec  6 04:41:04 np0005548918 ceph-mon[75798]: from='osd.2 [v2:192.168.122.102:6800/709563040,v1:192.168.122.102:6801/709563040]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec  6 04:41:04 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'iostat'
Dec  6 04:41:04 np0005548918 ceph-mgr[76108]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:41:04 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'k8sevents'
Dec  6 04:41:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:04.875+0000 7fbbe11bf140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:41:05 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'localpool'
Dec  6 04:41:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:05 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'mds_autoscaler'
Dec  6 04:41:05 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'mirroring'
Dec  6 04:41:05 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'nfs'
Dec  6 04:41:05 np0005548918 ceph-mgr[76108]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:41:05 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'orchestrator'
Dec  6 04:41:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:05.856+0000 7fbbe11bf140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:41:05 np0005548918 ceph-mon[75798]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'osd_perf_query'
Dec  6 04:41:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:06.080+0000 7fbbe11bf140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'osd_support'
Dec  6 04:41:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:06.152+0000 7fbbe11bf140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'pg_autoscaler'
Dec  6 04:41:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:06.220+0000 7fbbe11bf140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'progress'
Dec  6 04:41:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:06.299+0000 7fbbe11bf140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'prometheus'
Dec  6 04:41:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:06.376+0000 7fbbe11bf140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rbd_support'
Dec  6 04:41:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:06.711+0000 7fbbe11bf140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:41:06 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'restful'
Dec  6 04:41:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:06.803+0000 7fbbe11bf140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rgw'
Dec  6 04:41:07 np0005548918 ceph-mgr[76108]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rook'
Dec  6 04:41:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:07.214+0000 7fbbe11bf140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548918 ceph-mgr[76108]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:07.769+0000 7fbbe11bf140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'selftest'
Dec  6 04:41:07 np0005548918 ceph-mgr[76108]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:07.835+0000 7fbbe11bf140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'snap_schedule'
Dec  6 04:41:07 np0005548918 ceph-mgr[76108]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'stats'
Dec  6 04:41:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:07.914+0000 7fbbe11bf140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:41:07 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'status'
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'telegraf'
Dec  6 04:41:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:08.062+0000 7fbbe11bf140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'telemetry'
Dec  6 04:41:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:08.136+0000 7fbbe11bf140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'test_orchestrator'
Dec  6 04:41:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:08.297+0000 7fbbe11bf140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:08.531+0000 7fbbe11bf140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'volumes'
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'zabbix'
Dec  6 04:41:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:08.814+0000 7fbbe11bf140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:08.883+0000 7fbbe11bf140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: mgr load Constructed class from module: dashboard
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: [dashboard INFO root] Starting engine...
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: ms_deliver_dispatch: unhandled message 0x55e1d3a836c0 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Dec  6 04:41:08 np0005548918 ceph-mgr[76108]: [dashboard INFO root] Engine started...
Dec  6 04:41:09 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e34 e34: 3 total, 2 up, 3 in
Dec  6 04:41:09 np0005548918 systemd-logind[800]: New session 33 of user ceph-admin.
Dec  6 04:41:09 np0005548918 systemd[1]: Started Session 33 of User ceph-admin.
Dec  6 04:41:10 np0005548918 ceph-mon[75798]: Active manager daemon compute-0.qhdjwa restarted
Dec  6 04:41:10 np0005548918 ceph-mon[75798]: Activating manager daemon compute-0.qhdjwa
Dec  6 04:41:10 np0005548918 ceph-mon[75798]: Manager daemon compute-0.qhdjwa is now available
Dec  6 04:41:10 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/mirror_snapshot_schedule"}]: dispatch
Dec  6 04:41:10 np0005548918 ceph-osd[78376]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 11.766 iops: 3012.212 elapsed_sec: 0.996
Dec  6 04:41:10 np0005548918 ceph-osd[78376]: log_channel(cluster) log [WRN] : OSD bench result of 3012.211775 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec  6 04:41:10 np0005548918 ceph-osd[78376]: osd.2 0 waiting for initial osdmap
Dec  6 04:41:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2[78372]: 2025-12-06T09:41:10.221+0000 7f20dcccd640 -1 osd.2 0 waiting for initial osdmap
Dec  6 04:41:10 np0005548918 ceph-osd[78376]: osd.2 34 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec  6 04:41:10 np0005548918 ceph-osd[78376]: osd.2 34 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec  6 04:41:10 np0005548918 ceph-osd[78376]: osd.2 34 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec  6 04:41:10 np0005548918 ceph-osd[78376]: osd.2 34 check_osdmap_features require_osd_release unknown -> squid
Dec  6 04:41:10 np0005548918 ceph-osd[78376]: osd.2 34 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  6 04:41:10 np0005548918 ceph-osd[78376]: osd.2 34 set_numa_affinity not setting numa affinity
Dec  6 04:41:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-2[78372]: 2025-12-06T09:41:10.256+0000 7f20d82f5640 -1 osd.2 34 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  6 04:41:10 np0005548918 ceph-osd[78376]: osd.2 34 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Dec  6 04:41:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e34 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:10 np0005548918 podman[79467]: 2025-12-06 09:41:10.673263214 +0000 UTC m=+0.063832836 container exec 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  6 04:41:10 np0005548918 podman[79467]: 2025-12-06 09:41:10.766742314 +0000 UTC m=+0.157311946 container exec_died 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 34 tick checking mon for new map
Dec  6 04:41:11 np0005548918 ceph-mon[75798]: OSD bench result of 3012.211775 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec  6 04:41:11 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:11 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:11 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:11 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e35 e35: 3 total, 3 up, 3 in
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 35 state: booting -> active
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[5.13( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35) [2] r=0 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.15( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[5.12( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35) [2] r=0 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[5.8( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35) [2] r=0 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.c( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.10( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[5.b( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35) [2] r=0 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.d( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[5.d( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35) [2] r=0 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.13( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.a( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[3.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[5.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=35) [2] r=0 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[4.6( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[3.8( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[4.2( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.1b( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[4.3( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[4.1d( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[4.1c( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[4.19( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[3.1b( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[4.14( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[4.1f( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.18( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[3.15( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.12( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.f( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[3.e( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[4.8( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[3.11( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[4.15( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[5.4( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.b( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[4.9( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.5( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[4.1( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[3.9( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[5.e( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[3.1a( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.1c( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[5.1a( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[2.1d( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 35 pg[3.1d( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:41:12 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:41:11] ENGINE Bus STARTING
Dec  6 04:41:12 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:12 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:41:11] ENGINE Serving on https://192.168.122.100:7150
Dec  6 04:41:12 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:41:11] ENGINE Client ('192.168.122.100', 54474) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  6 04:41:12 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:12 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:12 np0005548918 ceph-mon[75798]: osd.2 [v2:192.168.122.102:6800/709563040,v1:192.168.122.102:6801/709563040] boot
Dec  6 04:41:12 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:41:11] ENGINE Serving on http://192.168.122.100:8765
Dec  6 04:41:12 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:41:11] ENGINE Bus STARTED
Dec  6 04:41:12 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:12 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:12 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e36 e36: 3 total, 3 up, 3 in
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[5.1a( empty local-lis/les=35/36 n=0 ec=27/19 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.b( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.1c( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.1d( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[5.4( empty local-lis/les=35/36 n=0 ec=27/19 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.13( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.d( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[4.15( empty local-lis/les=35/36 n=0 ec=25/17 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[3.9( empty local-lis/les=35/36 n=0 ec=24/15 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[3.11( empty local-lis/les=35/36 n=0 ec=24/15 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.a( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.10( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[5.d( empty local-lis/les=35/36 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35) [2] r=0 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[5.b( empty local-lis/les=35/36 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35) [2] r=0 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[4.9( empty local-lis/les=35/36 n=0 ec=25/17 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[4.8( empty local-lis/les=35/36 n=0 ec=25/17 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.c( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[3.1d( empty local-lis/les=35/36 n=0 ec=24/15 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[5.12( empty local-lis/les=35/36 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35) [2] r=0 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.15( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[5.8( empty local-lis/les=35/36 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35) [2] r=0 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[3.e( empty local-lis/les=35/36 n=0 ec=24/15 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[3.15( empty local-lis/les=35/36 n=0 ec=24/15 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[5.13( empty local-lis/les=35/36 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35) [2] r=0 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.12( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[4.14( empty local-lis/les=35/36 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[4.1( empty local-lis/les=35/36 n=0 ec=25/17 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[3.0( empty local-lis/les=35/36 n=0 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[4.6( empty local-lis/les=35/36 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[5.e( empty local-lis/les=35/36 n=0 ec=27/19 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[5.0( empty local-lis/les=35/36 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=35) [2] r=0 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[3.8( empty local-lis/les=35/36 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[3.1a( empty local-lis/les=35/36 n=0 ec=24/15 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.1b( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[4.1d( empty local-lis/les=35/36 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.f( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.5( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[4.3( empty local-lis/les=35/36 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[4.1c( empty local-lis/les=35/36 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[4.19( empty local-lis/les=35/36 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[2.18( empty local-lis/les=35/36 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[4.2( empty local-lis/les=35/36 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35) [2] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[3.1b( empty local-lis/les=35/36 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=35) [2] r=0 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 36 pg[4.1f( empty local-lis/les=35/36 n=0 ec=25/17 lis/c=29/29 les/c/f=30/30/0 sis=35) [2] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Dec  6 04:41:12 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Dec  6 04:41:12 np0005548918 podman[79795]: 2025-12-06 09:41:12.981163023 +0000 UTC m=+0.061468311 container create 070ea61ec26be3d4d929d5321c5b5bcc39730a3589de3a4dcf15ab8c2d9c2924 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_snyder, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  6 04:41:13 np0005548918 systemd[1]: Started libpod-conmon-070ea61ec26be3d4d929d5321c5b5bcc39730a3589de3a4dcf15ab8c2d9c2924.scope.
Dec  6 04:41:13 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:41:13 np0005548918 podman[79795]: 2025-12-06 09:41:12.946540572 +0000 UTC m=+0.026845870 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:41:13 np0005548918 podman[79795]: 2025-12-06 09:41:13.047241439 +0000 UTC m=+0.127546747 container init 070ea61ec26be3d4d929d5321c5b5bcc39730a3589de3a4dcf15ab8c2d9c2924 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_snyder, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:41:13 np0005548918 podman[79795]: 2025-12-06 09:41:13.053456018 +0000 UTC m=+0.133761306 container start 070ea61ec26be3d4d929d5321c5b5bcc39730a3589de3a4dcf15ab8c2d9c2924 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_snyder, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:41:13 np0005548918 podman[79795]: 2025-12-06 09:41:13.056611234 +0000 UTC m=+0.136916552 container attach 070ea61ec26be3d4d929d5321c5b5bcc39730a3589de3a4dcf15ab8c2d9c2924 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_snyder, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:41:13 np0005548918 funny_snyder[79812]: 167 167
Dec  6 04:41:13 np0005548918 systemd[1]: libpod-070ea61ec26be3d4d929d5321c5b5bcc39730a3589de3a4dcf15ab8c2d9c2924.scope: Deactivated successfully.
Dec  6 04:41:13 np0005548918 podman[79795]: 2025-12-06 09:41:13.059482592 +0000 UTC m=+0.139787890 container died 070ea61ec26be3d4d929d5321c5b5bcc39730a3589de3a4dcf15ab8c2d9c2924 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_snyder, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:41:13 np0005548918 systemd[1]: var-lib-containers-storage-overlay-0c70b900dd70dc3a5171d1785371797158506b6e06900b2e9e7c6073fe4a0146-merged.mount: Deactivated successfully.
Dec  6 04:41:13 np0005548918 podman[79795]: 2025-12-06 09:41:13.104994839 +0000 UTC m=+0.185300147 container remove 070ea61ec26be3d4d929d5321c5b5bcc39730a3589de3a4dcf15ab8c2d9c2924 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_snyder, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True)
Dec  6 04:41:13 np0005548918 systemd[1]: libpod-conmon-070ea61ec26be3d4d929d5321c5b5bcc39730a3589de3a4dcf15ab8c2d9c2924.scope: Deactivated successfully.
Dec  6 04:41:13 np0005548918 podman[79836]: 2025-12-06 09:41:13.278831814 +0000 UTC m=+0.048303974 container create f78ca940177e1cd115b4890e8365aa55b3a924e7f7188e7486599c9a4c7d962d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_mayer, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Dec  6 04:41:13 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/trash_purge_schedule"}]: dispatch
Dec  6 04:41:13 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Dec  6 04:41:13 np0005548918 ceph-mon[75798]: Adjusting osd_memory_target on compute-0 to 128.0M
Dec  6 04:41:13 np0005548918 ceph-mon[75798]: Unable to set osd_memory_target on compute-0 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Dec  6 04:41:13 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec  6 04:41:13 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:13 np0005548918 systemd[1]: Started libpod-conmon-f78ca940177e1cd115b4890e8365aa55b3a924e7f7188e7486599c9a4c7d962d.scope.
Dec  6 04:41:13 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:41:13 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fba815efbcd79dd4cbfe132525e46c9b25f7ca7670a28dbc4722ca0b24c946a7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:13 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fba815efbcd79dd4cbfe132525e46c9b25f7ca7670a28dbc4722ca0b24c946a7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:13 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fba815efbcd79dd4cbfe132525e46c9b25f7ca7670a28dbc4722ca0b24c946a7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:13 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fba815efbcd79dd4cbfe132525e46c9b25f7ca7670a28dbc4722ca0b24c946a7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:13 np0005548918 podman[79836]: 2025-12-06 09:41:13.354418668 +0000 UTC m=+0.123890838 container init f78ca940177e1cd115b4890e8365aa55b3a924e7f7188e7486599c9a4c7d962d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_mayer, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec  6 04:41:13 np0005548918 podman[79836]: 2025-12-06 09:41:13.261162224 +0000 UTC m=+0.030634374 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:41:13 np0005548918 podman[79836]: 2025-12-06 09:41:13.360102353 +0000 UTC m=+0.129574513 container start f78ca940177e1cd115b4890e8365aa55b3a924e7f7188e7486599c9a4c7d962d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_mayer, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec  6 04:41:13 np0005548918 podman[79836]: 2025-12-06 09:41:13.363508355 +0000 UTC m=+0.132980565 container attach f78ca940177e1cd115b4890e8365aa55b3a924e7f7188e7486599c9a4c7d962d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_mayer, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:41:13 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Dec  6 04:41:13 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]: [
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:    {
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:        "available": false,
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:        "being_replaced": false,
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:        "ceph_device_lvm": false,
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:        "device_id": "QEMU_DVD-ROM_QM00001",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:        "lsm_data": {},
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:        "lvs": [],
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:        "path": "/dev/sr0",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:        "rejected_reasons": [
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "Has a FileSystem",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "Insufficient space (<5GB)"
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:        ],
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:        "sys_api": {
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "actuators": null,
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "device_nodes": [
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:                "sr0"
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            ],
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "devname": "sr0",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "human_readable_size": "482.00 KB",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "id_bus": "ata",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "model": "QEMU DVD-ROM",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "nr_requests": "2",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "parent": "/dev/sr0",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "partitions": {},
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "path": "/dev/sr0",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "removable": "1",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "rev": "2.5+",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "ro": "0",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "rotational": "1",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "sas_address": "",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "sas_device_handle": "",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "scheduler_mode": "mq-deadline",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "sectors": 0,
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "sectorsize": "2048",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "size": 493568.0,
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "support_discard": "2048",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "type": "disk",
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:            "vendor": "QEMU"
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:        }
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]:    }
Dec  6 04:41:14 np0005548918 sleepy_mayer[79853]: ]
Dec  6 04:41:14 np0005548918 systemd[1]: libpod-f78ca940177e1cd115b4890e8365aa55b3a924e7f7188e7486599c9a4c7d962d.scope: Deactivated successfully.
Dec  6 04:41:14 np0005548918 podman[79836]: 2025-12-06 09:41:14.160367505 +0000 UTC m=+0.929839695 container died f78ca940177e1cd115b4890e8365aa55b3a924e7f7188e7486599c9a4c7d962d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_mayer, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:41:14 np0005548918 systemd[1]: var-lib-containers-storage-overlay-fba815efbcd79dd4cbfe132525e46c9b25f7ca7670a28dbc4722ca0b24c946a7-merged.mount: Deactivated successfully.
Dec  6 04:41:14 np0005548918 podman[79836]: 2025-12-06 09:41:14.221744912 +0000 UTC m=+0.991217072 container remove f78ca940177e1cd115b4890e8365aa55b3a924e7f7188e7486599c9a4c7d962d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_mayer, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  6 04:41:14 np0005548918 systemd[1]: libpod-conmon-f78ca940177e1cd115b4890e8365aa55b3a924e7f7188e7486599c9a4c7d962d.scope: Deactivated successfully.
Dec  6 04:41:14 np0005548918 ceph-mon[75798]: Adjusting osd_memory_target on compute-1 to 127.9M
Dec  6 04:41:14 np0005548918 ceph-mon[75798]: Unable to set osd_memory_target on compute-1 to 134211993: error parsing value: Value '134211993' is below minimum 939524096
Dec  6 04:41:14 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:14 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 2.d scrub starts
Dec  6 04:41:14 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 2.d scrub ok
Dec  6 04:41:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:15 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:15 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec  6 04:41:15 np0005548918 ceph-mon[75798]: Adjusting osd_memory_target on compute-2 to 127.9M
Dec  6 04:41:15 np0005548918 ceph-mon[75798]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Dec  6 04:41:15 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:41:15 np0005548918 ceph-mon[75798]: Updating compute-0:/etc/ceph/ceph.conf
Dec  6 04:41:15 np0005548918 ceph-mon[75798]: Updating compute-1:/etc/ceph/ceph.conf
Dec  6 04:41:15 np0005548918 ceph-mon[75798]: Updating compute-2:/etc/ceph/ceph.conf
Dec  6 04:41:15 np0005548918 ceph-mon[75798]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:15 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Dec  6 04:41:15 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Dec  6 04:41:16 np0005548918 ceph-mon[75798]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:41:16 np0005548918 ceph-mon[75798]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:41:16 np0005548918 ceph-mon[75798]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:41:16 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/986641805' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Dec  6 04:41:16 np0005548918 ceph-mon[75798]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:41:16 np0005548918 ceph-mon[75798]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:41:16 np0005548918 ceph-mon[75798]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn  1: '-n'
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn  2: 'mgr.compute-2.oazbvn'
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn  3: '-f'
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn  4: '--setuser'
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn  5: 'ceph'
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn  6: '--setgroup'
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn  7: 'ceph'
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn  8: '--default-log-to-file=false'
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn  9: '--default-log-to-journald=true'
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn  10: '--default-log-to-stderr=false'
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr respawn  exe_path /proc/self/exe
Dec  6 04:41:16 np0005548918 systemd[1]: session-33.scope: Deactivated successfully.
Dec  6 04:41:16 np0005548918 systemd[1]: session-33.scope: Consumed 5.390s CPU time.
Dec  6 04:41:16 np0005548918 systemd-logind[800]: Session 33 logged out. Waiting for processes to exit.
Dec  6 04:41:16 np0005548918 systemd-logind[800]: Removed session 33.
Dec  6 04:41:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: ignoring --setuser ceph since I am not root
Dec  6 04:41:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: ignoring --setgroup ceph since I am not root
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: pidfile_write: ignore empty --pid-file
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'alerts'
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'balancer'
Dec  6 04:41:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:16.596+0000 7f59299f9140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:41:16 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'cephadm'
Dec  6 04:41:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:16.679+0000 7f59299f9140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:41:16 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Dec  6 04:41:16 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Dec  6 04:41:17 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/986641805' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Dec  6 04:41:17 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/2772325777' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Dec  6 04:41:17 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'crash'
Dec  6 04:41:17 np0005548918 ceph-mgr[76108]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:41:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:17.468+0000 7f59299f9140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:41:17 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'dashboard'
Dec  6 04:41:17 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Dec  6 04:41:17 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Dec  6 04:41:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'devicehealth'
Dec  6 04:41:18 np0005548918 ceph-mgr[76108]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:18.087+0000 7f59299f9140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'diskprediction_local'
Dec  6 04:41:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  6 04:41:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  6 04:41:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]:  from numpy import show_config as show_numpy_config
Dec  6 04:41:18 np0005548918 ceph-mgr[76108]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:18.257+0000 7f59299f9140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'influx'
Dec  6 04:41:18 np0005548918 ceph-mgr[76108]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'insights'
Dec  6 04:41:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:18.325+0000 7f59299f9140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'iostat'
Dec  6 04:41:18 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/2772325777' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Dec  6 04:41:18 np0005548918 ceph-mgr[76108]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'k8sevents'
Dec  6 04:41:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:18.462+0000 7f59299f9140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:41:18 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Dec  6 04:41:18 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Dec  6 04:41:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'localpool'
Dec  6 04:41:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'mds_autoscaler'
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'mirroring'
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'nfs'
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'orchestrator'
Dec  6 04:41:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:19.426+0000 7f59299f9140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'osd_perf_query'
Dec  6 04:41:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:19.643+0000 7f59299f9140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'osd_support'
Dec  6 04:41:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:19.717+0000 7f59299f9140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:41:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:19.779+0000 7f59299f9140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'pg_autoscaler'
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'progress'
Dec  6 04:41:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:19.856+0000 7f59299f9140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:41:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:19.926+0000 7f59299f9140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:41:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'prometheus'
Dec  6 04:41:20 np0005548918 ceph-mgr[76108]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:20.255+0000 7f59299f9140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rbd_support'
Dec  6 04:41:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:20 np0005548918 ceph-mgr[76108]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'restful'
Dec  6 04:41:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:20.351+0000 7f59299f9140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rgw'
Dec  6 04:41:20 np0005548918 ceph-mgr[76108]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:20.787+0000 7f59299f9140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:41:20 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rook'
Dec  6 04:41:21 np0005548918 ceph-mgr[76108]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:21.347+0000 7f59299f9140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'selftest'
Dec  6 04:41:21 np0005548918 ceph-mgr[76108]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:21.423+0000 7f59299f9140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'snap_schedule'
Dec  6 04:41:21 np0005548918 ceph-mgr[76108]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:21.505+0000 7f59299f9140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'stats'
Dec  6 04:41:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'status'
Dec  6 04:41:21 np0005548918 ceph-mgr[76108]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'telegraf'
Dec  6 04:41:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:21.655+0000 7f59299f9140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548918 ceph-mgr[76108]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:21.725+0000 7f59299f9140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'telemetry'
Dec  6 04:41:21 np0005548918 ceph-mgr[76108]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:41:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'test_orchestrator'
Dec  6 04:41:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:21.876+0000 7f59299f9140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'volumes'
Dec  6 04:41:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:22.094+0000 7f59299f9140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'zabbix'
Dec  6 04:41:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:22.365+0000 7f59299f9140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:22.440+0000 7f59299f9140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: ms_deliver_dispatch: unhandled message 0x55b99e32d860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  6 04:41:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: ignoring --setuser ceph since I am not root
Dec  6 04:41:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: ignoring --setgroup ceph since I am not root
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: pidfile_write: ignore empty --pid-file
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'alerts'
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'balancer'
Dec  6 04:41:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:22.659+0000 7fd5c432f140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e37 e37: 3 total, 3 up, 3 in
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:41:22 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'cephadm'
Dec  6 04:41:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:22.742+0000 7fd5c432f140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:41:23 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'crash'
Dec  6 04:41:23 np0005548918 ceph-mgr[76108]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:41:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:23.522+0000 7fd5c432f140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:41:23 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'dashboard'
Dec  6 04:41:23 np0005548918 ceph-mon[75798]: Active manager daemon compute-0.qhdjwa restarted
Dec  6 04:41:23 np0005548918 ceph-mon[75798]: Activating manager daemon compute-0.qhdjwa
Dec  6 04:41:24 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'devicehealth'
Dec  6 04:41:24 np0005548918 ceph-mgr[76108]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:41:24 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'diskprediction_local'
Dec  6 04:41:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:24.132+0000 7fd5c432f140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:41:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  6 04:41:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  6 04:41:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]:  from numpy import show_config as show_numpy_config
Dec  6 04:41:24 np0005548918 ceph-mgr[76108]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:41:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:24.290+0000 7fd5c432f140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:41:24 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'influx'
Dec  6 04:41:24 np0005548918 ceph-mgr[76108]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:41:24 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'insights'
Dec  6 04:41:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:24.361+0000 7fd5c432f140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:41:24 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'iostat'
Dec  6 04:41:24 np0005548918 ceph-mgr[76108]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:41:24 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'k8sevents'
Dec  6 04:41:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:24.496+0000 7fd5c432f140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:41:24 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'localpool'
Dec  6 04:41:24 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'mds_autoscaler'
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'mirroring'
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'nfs'
Dec  6 04:41:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'orchestrator'
Dec  6 04:41:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:25.448+0000 7fd5c432f140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'osd_perf_query'
Dec  6 04:41:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:25.664+0000 7fd5c432f140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'osd_support'
Dec  6 04:41:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:25.736+0000 7fd5c432f140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'pg_autoscaler'
Dec  6 04:41:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:25.796+0000 7fd5c432f140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'progress'
Dec  6 04:41:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:25.871+0000 7fd5c432f140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:41:25 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'prometheus'
Dec  6 04:41:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:25.937+0000 7fd5c432f140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:41:26 np0005548918 ceph-mgr[76108]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:41:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:26.259+0000 7fd5c432f140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:41:26 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rbd_support'
Dec  6 04:41:26 np0005548918 ceph-mgr[76108]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:41:26 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'restful'
Dec  6 04:41:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:26.353+0000 7fd5c432f140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:41:26 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rgw'
Dec  6 04:41:26 np0005548918 systemd[1]: Stopping User Manager for UID 42477...
Dec  6 04:41:26 np0005548918 systemd[72489]: Activating special unit Exit the Session...
Dec  6 04:41:26 np0005548918 systemd[72489]: Stopped target Main User Target.
Dec  6 04:41:26 np0005548918 systemd[72489]: Stopped target Basic System.
Dec  6 04:41:26 np0005548918 systemd[72489]: Stopped target Paths.
Dec  6 04:41:26 np0005548918 systemd[72489]: Stopped target Sockets.
Dec  6 04:41:26 np0005548918 systemd[72489]: Stopped target Timers.
Dec  6 04:41:26 np0005548918 systemd[72489]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec  6 04:41:26 np0005548918 systemd[72489]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  6 04:41:26 np0005548918 systemd[72489]: Closed D-Bus User Message Bus Socket.
Dec  6 04:41:26 np0005548918 systemd[72489]: Stopped Create User's Volatile Files and Directories.
Dec  6 04:41:26 np0005548918 systemd[72489]: Removed slice User Application Slice.
Dec  6 04:41:26 np0005548918 systemd[72489]: Reached target Shutdown.
Dec  6 04:41:26 np0005548918 systemd[72489]: Finished Exit the Session.
Dec  6 04:41:26 np0005548918 systemd[72489]: Reached target Exit the Session.
Dec  6 04:41:26 np0005548918 systemd[1]: user@42477.service: Deactivated successfully.
Dec  6 04:41:26 np0005548918 systemd[1]: Stopped User Manager for UID 42477.
Dec  6 04:41:26 np0005548918 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Dec  6 04:41:26 np0005548918 systemd[1]: run-user-42477.mount: Deactivated successfully.
Dec  6 04:41:26 np0005548918 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Dec  6 04:41:26 np0005548918 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Dec  6 04:41:26 np0005548918 systemd[1]: Removed slice User Slice of UID 42477.
Dec  6 04:41:26 np0005548918 systemd[1]: user-42477.slice: Consumed 1min 2.571s CPU time.
Dec  6 04:41:26 np0005548918 ceph-mgr[76108]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:41:26 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rook'
Dec  6 04:41:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:26.785+0000 7fd5c432f140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548918 ceph-mgr[76108]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'selftest'
Dec  6 04:41:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:27.323+0000 7fd5c432f140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548918 ceph-mgr[76108]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'snap_schedule'
Dec  6 04:41:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:27.400+0000 7fd5c432f140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548918 ceph-mgr[76108]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'stats'
Dec  6 04:41:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:27.481+0000 7fd5c432f140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'status'
Dec  6 04:41:27 np0005548918 ceph-mgr[76108]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'telegraf'
Dec  6 04:41:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:27.625+0000 7fd5c432f140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548918 ceph-mgr[76108]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'telemetry'
Dec  6 04:41:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:27.693+0000 7fd5c432f140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548918 ceph-mgr[76108]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:41:27 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'test_orchestrator'
Dec  6 04:41:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:27.845+0000 7fd5c432f140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:41:28 np0005548918 ceph-mgr[76108]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:28 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'volumes'
Dec  6 04:41:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:28.065+0000 7fd5c432f140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:41:28 np0005548918 ceph-mgr[76108]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:41:28 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'zabbix'
Dec  6 04:41:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:28.335+0000 7fd5c432f140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:41:28 np0005548918 ceph-mgr[76108]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:41:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:41:28.405+0000 7fd5c432f140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:41:28 np0005548918 ceph-mgr[76108]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  6 04:41:28 np0005548918 ceph-mgr[76108]: mgr load Constructed class from module: dashboard
Dec  6 04:41:28 np0005548918 ceph-mgr[76108]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Dec  6 04:41:28 np0005548918 ceph-mgr[76108]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec  6 04:41:28 np0005548918 ceph-mgr[76108]: ms_deliver_dispatch: unhandled message 0x5653e86f9860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Dec  6 04:41:28 np0005548918 ceph-mgr[76108]: [dashboard INFO root] Starting engine...
Dec  6 04:41:28 np0005548918 ceph-mgr[76108]: [dashboard INFO root] Engine started...
Dec  6 04:41:28 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e38 e38: 3 total, 3 up, 3 in
Dec  6 04:41:29 np0005548918 systemd-logind[800]: New session 34 of user ceph-admin.
Dec  6 04:41:29 np0005548918 systemd[1]: Created slice User Slice of UID 42477.
Dec  6 04:41:29 np0005548918 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec  6 04:41:29 np0005548918 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec  6 04:41:29 np0005548918 systemd[1]: Starting User Manager for UID 42477...
Dec  6 04:41:29 np0005548918 systemd[81631]: Queued start job for default target Main User Target.
Dec  6 04:41:29 np0005548918 systemd[81631]: Created slice User Application Slice.
Dec  6 04:41:29 np0005548918 systemd[81631]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  6 04:41:29 np0005548918 systemd[81631]: Started Daily Cleanup of User's Temporary Directories.
Dec  6 04:41:29 np0005548918 systemd[81631]: Reached target Paths.
Dec  6 04:41:29 np0005548918 systemd[81631]: Reached target Timers.
Dec  6 04:41:29 np0005548918 systemd[81631]: Starting D-Bus User Message Bus Socket...
Dec  6 04:41:29 np0005548918 systemd[81631]: Starting Create User's Volatile Files and Directories...
Dec  6 04:41:29 np0005548918 systemd[81631]: Finished Create User's Volatile Files and Directories.
Dec  6 04:41:29 np0005548918 systemd[81631]: Listening on D-Bus User Message Bus Socket.
Dec  6 04:41:29 np0005548918 systemd[81631]: Reached target Sockets.
Dec  6 04:41:29 np0005548918 systemd[81631]: Reached target Basic System.
Dec  6 04:41:29 np0005548918 systemd[81631]: Reached target Main User Target.
Dec  6 04:41:29 np0005548918 systemd[81631]: Startup finished in 124ms.
Dec  6 04:41:29 np0005548918 systemd[1]: Started User Manager for UID 42477.
Dec  6 04:41:29 np0005548918 systemd[1]: Started Session 34 of User ceph-admin.
Dec  6 04:41:29 np0005548918 ceph-mon[75798]: Active manager daemon compute-0.qhdjwa restarted
Dec  6 04:41:29 np0005548918 ceph-mon[75798]: Activating manager daemon compute-0.qhdjwa
Dec  6 04:41:29 np0005548918 ceph-mon[75798]: Manager daemon compute-0.qhdjwa is now available
Dec  6 04:41:29 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/mirror_snapshot_schedule"}]: dispatch
Dec  6 04:41:29 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/trash_purge_schedule"}]: dispatch
Dec  6 04:41:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e2 new map
Dec  6 04:41:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e2 print_map#012e2#012btime 2025-12-06T09:41:29:967825+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-06T09:41:29.967778+0000#012modified#0112025-12-06T09:41:29.967778+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 
Dec  6 04:41:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e39 e39: 3 total, 3 up, 3 in
Dec  6 04:41:30 np0005548918 podman[81770]: 2025-12-06 09:41:30.260294486 +0000 UTC m=+0.057926086 container exec 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:30 np0005548918 podman[81770]: 2025-12-06 09:41:30.353754656 +0000 UTC m=+0.151386296 container exec_died 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:41:30] ENGINE Bus STARTING
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:41:30] ENGINE Serving on https://192.168.122.100:7150
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:41:30] ENGINE Client ('192.168.122.100', 59318) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:41:30] ENGINE Serving on http://192.168.122.100:8765
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:41:30] ENGINE Bus STARTED
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:30 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:31 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:31 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:31 np0005548918 ceph-mon[75798]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Dec  6 04:41:31 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:31 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:31 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:31 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: Adjusting osd_memory_target on compute-2 to 127.9M
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: Adjusting osd_memory_target on compute-0 to 128.0M
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: Unable to set osd_memory_target on compute-0 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: Adjusting osd_memory_target on compute-1 to 127.9M
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: Unable to set osd_memory_target on compute-1 to 134211993: error parsing value: Value '134211993' is below minimum 939524096
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: Updating compute-0:/etc/ceph/ceph.conf
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: Updating compute-1:/etc/ceph/ceph.conf
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: Updating compute-2:/etc/ceph/ceph.conf
Dec  6 04:41:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e40 e40: 3 total, 3 up, 3 in
Dec  6 04:41:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Dec  6 04:41:34 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Dec  6 04:41:34 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Dec  6 04:41:34 np0005548918 ceph-mon[75798]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:41:34 np0005548918 ceph-mon[75798]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:41:34 np0005548918 ceph-mon[75798]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:41:34 np0005548918 ceph-mon[75798]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:41:34 np0005548918 ceph-mon[75798]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Dec  6 04:41:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:36 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:36 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:36 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:36 np0005548918 ceph-mon[75798]: Deploying daemon node-exporter.compute-0 on compute-0
Dec  6 04:41:37 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/351927990' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Dec  6 04:41:37 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/351927990' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Dec  6 04:41:38 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:38 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:38 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:39 np0005548918 ceph-mon[75798]: Deploying daemon node-exporter.compute-1 on compute-1
Dec  6 04:41:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:41 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/1032166629' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Dec  6 04:41:41 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:41 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:41 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:41 np0005548918 systemd[1]: Reloading.
Dec  6 04:41:42 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:41:42 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:41:42 np0005548918 systemd[1]: Reloading.
Dec  6 04:41:42 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:41:42 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:41:42 np0005548918 systemd[1]: Starting Ceph node-exporter.compute-2 for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:41:42 np0005548918 bash[83112]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Dec  6 04:41:43 np0005548918 ceph-mon[75798]: Deploying daemon node-exporter.compute-2 on compute-2
Dec  6 04:41:43 np0005548918 bash[83112]: Getting image source signatures
Dec  6 04:41:43 np0005548918 bash[83112]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Dec  6 04:41:43 np0005548918 bash[83112]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Dec  6 04:41:43 np0005548918 bash[83112]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Dec  6 04:41:43 np0005548918 bash[83112]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Dec  6 04:41:43 np0005548918 bash[83112]: Writing manifest to image destination
Dec  6 04:41:43 np0005548918 podman[83112]: 2025-12-06 09:41:43.707499015 +0000 UTC m=+1.031322883 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Dec  6 04:41:43 np0005548918 podman[83112]: 2025-12-06 09:41:43.719733887 +0000 UTC m=+1.043557755 container create 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:41:43 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b0f353fd119918dc84c81209355177868114f1cfe9b6377d6faa768ac8f466b/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:43 np0005548918 podman[83112]: 2025-12-06 09:41:43.762346665 +0000 UTC m=+1.086170563 container init 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:41:43 np0005548918 podman[83112]: 2025-12-06 09:41:43.76765647 +0000 UTC m=+1.091480338 container start 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:41:43 np0005548918 bash[83112]: 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.774Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.774Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.775Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.775Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.775Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.775Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec  6 04:41:43 np0005548918 systemd[1]: Started Ceph node-exporter.compute-2 for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=arp
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=bcache
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=bonding
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=btrfs
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=conntrack
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=cpu
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=diskstats
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=dmi
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=edac
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=entropy
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=filefd
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=filesystem
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=hwmon
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=infiniband
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=ipvs
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=loadavg
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=mdadm
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=meminfo
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=netclass
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=netdev
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=netstat
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=nfs
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=nfsd
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=nvme
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=os
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=pressure
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=rapl
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=schedstat
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=selinux
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=sockstat
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=softnet
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=stat
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=tapestats
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=textfile
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=thermal_zone
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=time
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=uname
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=vmstat
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=xfs
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.776Z caller=node_exporter.go:117 level=info collector=zfs
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.777Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Dec  6 04:41:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2[83185]: ts=2025-12-06T09:41:43.777Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Dec  6 04:41:44 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:44 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:44 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:44 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:44 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:41:44 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:49 np0005548918 podman[83287]: 2025-12-06 09:41:49.570207805 +0000 UTC m=+0.036859994 container create 3f02618409fa016efc23f874eef2bd38ebc650e5694af7c95b8ed1beb3ee6025 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_goldstine, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  6 04:41:49 np0005548918 systemd[1]: Started libpod-conmon-3f02618409fa016efc23f874eef2bd38ebc650e5694af7c95b8ed1beb3ee6025.scope.
Dec  6 04:41:49 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:41:49 np0005548918 podman[83287]: 2025-12-06 09:41:49.553420059 +0000 UTC m=+0.020072278 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:41:49 np0005548918 podman[83287]: 2025-12-06 09:41:49.654446674 +0000 UTC m=+0.121098893 container init 3f02618409fa016efc23f874eef2bd38ebc650e5694af7c95b8ed1beb3ee6025 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_goldstine, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  6 04:41:49 np0005548918 podman[83287]: 2025-12-06 09:41:49.660435747 +0000 UTC m=+0.127087936 container start 3f02618409fa016efc23f874eef2bd38ebc650e5694af7c95b8ed1beb3ee6025 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_goldstine, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:41:49 np0005548918 podman[83287]: 2025-12-06 09:41:49.663697196 +0000 UTC m=+0.130349405 container attach 3f02618409fa016efc23f874eef2bd38ebc650e5694af7c95b8ed1beb3ee6025 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_goldstine, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:41:49 np0005548918 flamboyant_goldstine[83303]: 167 167
Dec  6 04:41:49 np0005548918 systemd[1]: libpod-3f02618409fa016efc23f874eef2bd38ebc650e5694af7c95b8ed1beb3ee6025.scope: Deactivated successfully.
Dec  6 04:41:49 np0005548918 conmon[83303]: conmon 3f02618409fa016efc23 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3f02618409fa016efc23f874eef2bd38ebc650e5694af7c95b8ed1beb3ee6025.scope/container/memory.events
Dec  6 04:41:49 np0005548918 podman[83287]: 2025-12-06 09:41:49.665882535 +0000 UTC m=+0.132534724 container died 3f02618409fa016efc23f874eef2bd38ebc650e5694af7c95b8ed1beb3ee6025 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_goldstine, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:41:49 np0005548918 systemd[1]: var-lib-containers-storage-overlay-cc8770e755bc47bc1fbcc4fa9785e8b20d554f23dd80ed6dd69c2383b1724c12-merged.mount: Deactivated successfully.
Dec  6 04:41:49 np0005548918 podman[83287]: 2025-12-06 09:41:49.702707896 +0000 UTC m=+0.169360085 container remove 3f02618409fa016efc23f874eef2bd38ebc650e5694af7c95b8ed1beb3ee6025 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_goldstine, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:41:49 np0005548918 systemd[1]: libpod-conmon-3f02618409fa016efc23f874eef2bd38ebc650e5694af7c95b8ed1beb3ee6025.scope: Deactivated successfully.
Dec  6 04:41:49 np0005548918 systemd[1]: Reloading.
Dec  6 04:41:49 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:41:49 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:41:49 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:49 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:49 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.qizhkr", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  6 04:41:49 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.qizhkr", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  6 04:41:49 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:49 np0005548918 ceph-mon[75798]: Deploying daemon rgw.rgw.compute-2.qizhkr on compute-2
Dec  6 04:41:50 np0005548918 systemd[1]: Reloading.
Dec  6 04:41:50 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:41:50 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:41:50 np0005548918 systemd[1]: Starting Ceph rgw.rgw.compute-2.qizhkr for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:41:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:50 np0005548918 podman[83444]: 2025-12-06 09:41:50.51376516 +0000 UTC m=+0.057815222 container create 6747465906f6c82f2499aa7eee737900d4a09ee01087a44326936beb1c506eeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-rgw-rgw-compute-2-qizhkr, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:41:50 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b71165b045d1ecec7d6d7b3319051d467f61c5578c7a1f20eb81b410d44a182/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:50 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b71165b045d1ecec7d6d7b3319051d467f61c5578c7a1f20eb81b410d44a182/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:50 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b71165b045d1ecec7d6d7b3319051d467f61c5578c7a1f20eb81b410d44a182/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:50 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b71165b045d1ecec7d6d7b3319051d467f61c5578c7a1f20eb81b410d44a182/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.qizhkr supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:50 np0005548918 podman[83444]: 2025-12-06 09:41:50.575007915 +0000 UTC m=+0.119057977 container init 6747465906f6c82f2499aa7eee737900d4a09ee01087a44326936beb1c506eeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-rgw-rgw-compute-2-qizhkr, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:41:50 np0005548918 podman[83444]: 2025-12-06 09:41:50.485582654 +0000 UTC m=+0.029632796 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:41:50 np0005548918 podman[83444]: 2025-12-06 09:41:50.581098231 +0000 UTC m=+0.125148273 container start 6747465906f6c82f2499aa7eee737900d4a09ee01087a44326936beb1c506eeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-rgw-rgw-compute-2-qizhkr, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:41:50 np0005548918 bash[83444]: 6747465906f6c82f2499aa7eee737900d4a09ee01087a44326936beb1c506eeb
Dec  6 04:41:50 np0005548918 systemd[1]: Started Ceph rgw.rgw.compute-2.qizhkr for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:41:50 np0005548918 radosgw[83463]: deferred set uid:gid to 167:167 (ceph:ceph)
Dec  6 04:41:50 np0005548918 radosgw[83463]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Dec  6 04:41:50 np0005548918 radosgw[83463]: framework: beast
Dec  6 04:41:50 np0005548918 radosgw[83463]: framework conf key: endpoint, val: 192.168.122.102:8082
Dec  6 04:41:50 np0005548918 radosgw[83463]: init_numa not setting numa affinity
Dec  6 04:41:51 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Dec  6 04:41:51 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0)
Dec  6 04:41:51 np0005548918 ceph-mon[75798]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3027759423' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec  6 04:41:51 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:51 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:51 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:51 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.oqhsdh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  6 04:41:51 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.oqhsdh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  6 04:41:51 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:51 np0005548918 ceph-mon[75798]: Deploying daemon rgw.rgw.compute-1.oqhsdh on compute-1
Dec  6 04:41:52 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Dec  6 04:41:52 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec  6 04:41:52 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.102:0/3027759423' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec  6 04:41:52 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:52 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:52 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:52 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.zktslo", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  6 04:41:52 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.zktslo", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  6 04:41:52 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:52 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Dec  6 04:41:52 np0005548918 radosgw[83463]: rgw main: failed to create zone with (17) File exists
Dec  6 04:41:52 np0005548918 radosgw[83463]: rgw main: failed to create zonegroup with (17) File exists
Dec  6 04:41:53 np0005548918 ceph-mon[75798]: Deploying daemon rgw.rgw.compute-0.zktslo on compute-0
Dec  6 04:41:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Dec  6 04:41:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Dec  6 04:41:53 np0005548918 ceph-mon[75798]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  6 04:41:54 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  6 04:41:54 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  6 04:41:54 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  6 04:41:54 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  6 04:41:54 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:54 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:54 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:54 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:54 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:54 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.czucwy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  6 04:41:54 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.czucwy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  6 04:41:54 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Dec  6 04:41:55 np0005548918 podman[84140]: 2025-12-06 09:41:55.234431571 +0000 UTC m=+0.063633701 container create a7b49cb1d59a6edb1cd23983be19e78e2281aa54f97965688e87d2b686fff77b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=thirsty_mestorf, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:41:55 np0005548918 systemd[1]: Started libpod-conmon-a7b49cb1d59a6edb1cd23983be19e78e2281aa54f97965688e87d2b686fff77b.scope.
Dec  6 04:41:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e46 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:41:55 np0005548918 podman[84140]: 2025-12-06 09:41:55.209825802 +0000 UTC m=+0.039027932 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:41:55 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:41:55 np0005548918 podman[84140]: 2025-12-06 09:41:55.357458795 +0000 UTC m=+0.186660965 container init a7b49cb1d59a6edb1cd23983be19e78e2281aa54f97965688e87d2b686fff77b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=thirsty_mestorf, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  6 04:41:55 np0005548918 podman[84140]: 2025-12-06 09:41:55.366765007 +0000 UTC m=+0.195967107 container start a7b49cb1d59a6edb1cd23983be19e78e2281aa54f97965688e87d2b686fff77b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=thirsty_mestorf, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:41:55 np0005548918 podman[84140]: 2025-12-06 09:41:55.372662528 +0000 UTC m=+0.201864628 container attach a7b49cb1d59a6edb1cd23983be19e78e2281aa54f97965688e87d2b686fff77b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=thirsty_mestorf, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Dec  6 04:41:55 np0005548918 thirsty_mestorf[84157]: 167 167
Dec  6 04:41:55 np0005548918 systemd[1]: libpod-a7b49cb1d59a6edb1cd23983be19e78e2281aa54f97965688e87d2b686fff77b.scope: Deactivated successfully.
Dec  6 04:41:55 np0005548918 conmon[84157]: conmon a7b49cb1d59a6edb1cd2 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a7b49cb1d59a6edb1cd23983be19e78e2281aa54f97965688e87d2b686fff77b.scope/container/memory.events
Dec  6 04:41:55 np0005548918 podman[84140]: 2025-12-06 09:41:55.377012595 +0000 UTC m=+0.206214725 container died a7b49cb1d59a6edb1cd23983be19e78e2281aa54f97965688e87d2b686fff77b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=thirsty_mestorf, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2)
Dec  6 04:41:55 np0005548918 systemd[1]: var-lib-containers-storage-overlay-d85f68753ae568828fff650f219bbe4b90459cd9f13205e9a9e29ed5f7994f34-merged.mount: Deactivated successfully.
Dec  6 04:41:55 np0005548918 podman[84140]: 2025-12-06 09:41:55.429561894 +0000 UTC m=+0.258763994 container remove a7b49cb1d59a6edb1cd23983be19e78e2281aa54f97965688e87d2b686fff77b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=thirsty_mestorf, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec  6 04:41:55 np0005548918 systemd[1]: libpod-conmon-a7b49cb1d59a6edb1cd23983be19e78e2281aa54f97965688e87d2b686fff77b.scope: Deactivated successfully.
Dec  6 04:41:55 np0005548918 systemd[1]: Reloading.
Dec  6 04:41:55 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:41:55 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:41:55 np0005548918 systemd[1]: Reloading.
Dec  6 04:41:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Dec  6 04:41:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Dec  6 04:41:55 np0005548918 ceph-mon[75798]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  6 04:41:55 np0005548918 ceph-mon[75798]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Dec  6 04:41:55 np0005548918 ceph-mon[75798]: Deploying daemon mds.cephfs.compute-2.czucwy on compute-2
Dec  6 04:41:55 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Dec  6 04:41:55 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Dec  6 04:41:55 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:41:55 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:41:56 np0005548918 systemd[1]: Starting Ceph mds.cephfs.compute-2.czucwy for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions", "format": "json"} v 0)
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/607080093' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Dec  6 04:41:56 np0005548918 podman[84300]: 2025-12-06 09:41:56.258267958 +0000 UTC m=+0.053759782 container create f7e5fe6f27c4186e3d87d6a775f55e4ff8479703f7103af0416838c801936627 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mds-cephfs-compute-2-czucwy, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:41:56 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c64c984d556c3489eb9d228c950e9b84fe557c449658e35d506ab3018f53e48/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:56 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c64c984d556c3489eb9d228c950e9b84fe557c449658e35d506ab3018f53e48/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:56 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c64c984d556c3489eb9d228c950e9b84fe557c449658e35d506ab3018f53e48/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:56 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c64c984d556c3489eb9d228c950e9b84fe557c449658e35d506ab3018f53e48/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.czucwy supports timestamps until 2038 (0x7fffffff)
Dec  6 04:41:56 np0005548918 podman[84300]: 2025-12-06 09:41:56.234657287 +0000 UTC m=+0.030149181 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:41:56 np0005548918 podman[84300]: 2025-12-06 09:41:56.334585073 +0000 UTC m=+0.130076947 container init f7e5fe6f27c4186e3d87d6a775f55e4ff8479703f7103af0416838c801936627 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mds-cephfs-compute-2-czucwy, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec  6 04:41:56 np0005548918 podman[84300]: 2025-12-06 09:41:56.344168974 +0000 UTC m=+0.139660818 container start f7e5fe6f27c4186e3d87d6a775f55e4ff8479703f7103af0416838c801936627 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mds-cephfs-compute-2-czucwy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Dec  6 04:41:56 np0005548918 bash[84300]: f7e5fe6f27c4186e3d87d6a775f55e4ff8479703f7103af0416838c801936627
Dec  6 04:41:56 np0005548918 systemd[1]: Started Ceph mds.cephfs.compute-2.czucwy for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: set uid:gid to 167:167 (ceph:ceph)
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: main not setting numa affinity
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: pidfile_write: ignore empty --pid-file
Dec  6 04:41:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mds-cephfs-compute-2-czucwy[84315]: starting mds.cephfs.compute-2.czucwy at 
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy Updating MDS map to version 2 from mon.1
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ujokui", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ujokui", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: Deploying daemon mds.cephfs.compute-0.ujokui on compute-0
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e3 new map
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e3 print_map#012e3#012btime 2025-12-06T09:41:56:804272+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-06T09:41:29.967778+0000#012modified#0112025-12-06T09:41:29.967778+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.czucwy{-1:24274} state up:standby seq 1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy Updating MDS map to version 3 from mon.1
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy Monitors have assigned me to become a standby
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e4 new map
Dec  6 04:41:56 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e4 print_map#012e4#012btime 2025-12-06T09:41:56:835698+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-06T09:41:29.967778+0000#012modified#0112025-12-06T09:41:56.835690+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24274}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012[mds.cephfs.compute-2.czucwy{0:24274} state up:creating seq 1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]#012 #012 
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy Updating MDS map to version 4 from mon.1
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.4 handle_mds_map I am now mds.0.4
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.cache creating system inode with ino:0x1
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.cache creating system inode with ino:0x100
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.cache creating system inode with ino:0x600
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.cache creating system inode with ino:0x601
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.cache creating system inode with ino:0x602
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.cache creating system inode with ino:0x603
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.cache creating system inode with ino:0x604
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.cache creating system inode with ino:0x605
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.cache creating system inode with ino:0x606
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.cache creating system inode with ino:0x607
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.cache creating system inode with ino:0x608
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.cache creating system inode with ino:0x609
Dec  6 04:41:56 np0005548918 ceph-mds[84319]: mds.0.4 creating_done
Dec  6 04:41:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Dec  6 04:41:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Dec  6 04:41:57 np0005548918 ceph-mon[75798]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  6 04:41:57 np0005548918 ceph-mon[75798]: daemon mds.cephfs.compute-2.czucwy assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Dec  6 04:41:57 np0005548918 ceph-mon[75798]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Dec  6 04:41:57 np0005548918 ceph-mon[75798]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Dec  6 04:41:57 np0005548918 ceph-mon[75798]: Cluster is now healthy
Dec  6 04:41:57 np0005548918 ceph-mon[75798]: daemon mds.cephfs.compute-2.czucwy is now active in filesystem cephfs as rank 0
Dec  6 04:41:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e5 new map
Dec  6 04:41:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e5 print_map#012e5#012btime 2025-12-06T09:41:57:856282+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-06T09:41:29.967778+0000#012modified#0112025-12-06T09:41:57.856277+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24274}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 24274 members: 24274#012[mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 2 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]#012 #012 
Dec  6 04:41:57 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy Updating MDS map to version 5 from mon.1
Dec  6 04:41:57 np0005548918 ceph-mds[84319]: mds.0.4 handle_mds_map I am now mds.0.4
Dec  6 04:41:57 np0005548918 ceph-mds[84319]: mds.0.4 handle_mds_map state change up:creating --> up:active
Dec  6 04:41:57 np0005548918 ceph-mds[84319]: mds.0.4 recovery_done -- successful recovery!
Dec  6 04:41:57 np0005548918 ceph-mds[84319]: mds.0.4 active_start
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.fpvjgb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.fpvjgb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: Deploying daemon mds.cephfs.compute-1.fpvjgb on compute-1
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e6 new map
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e6 print_map#012e6#012btime 2025-12-06T09:41:58:872230+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-06T09:41:29.967778+0000#012modified#0112025-12-06T09:41:57.856277+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24274}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 24274 members: 24274#012[mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 2 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e7 new map
Dec  6 04:41:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e7 print_map#012e7#012btime 2025-12-06T09:41:58:889029+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-06T09:41:29.967778+0000#012modified#0112025-12-06T09:41:57.856277+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24274}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24274 members: 24274#012[mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 2 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:41:59 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Dec  6 04:41:59 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  6 04:41:59 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  6 04:41:59 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  6 04:41:59 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  6 04:41:59 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  6 04:41:59 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:41:59 np0005548918 ceph-mon[75798]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec  6 04:41:59 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec  6 04:41:59 np0005548918 ceph-mon[75798]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec  6 04:42:00 np0005548918 radosgw[83463]: v1 topic migration: starting v1 topic migration..
Dec  6 04:42:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-rgw-rgw-compute-2-qizhkr[83459]: 2025-12-06T09:42:00.106+0000 7f89165ea980 -1 LDAP not started since no server URIs were provided in the configuration.
Dec  6 04:42:00 np0005548918 radosgw[83463]: LDAP not started since no server URIs were provided in the configuration.
Dec  6 04:42:00 np0005548918 radosgw[83463]: v1 topic migration: finished v1 topic migration
Dec  6 04:42:00 np0005548918 radosgw[83463]: framework: beast
Dec  6 04:42:00 np0005548918 radosgw[83463]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Dec  6 04:42:00 np0005548918 radosgw[83463]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Dec  6 04:42:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Dec  6 04:42:00 np0005548918 radosgw[83463]: starting handler: beast
Dec  6 04:42:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Dec  6 04:42:00 np0005548918 radosgw[83463]: set uid:gid to 167:167 (ceph:ceph)
Dec  6 04:42:00 np0005548918 radosgw[83463]: mgrc service_daemon_register rgw.24247 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.qizhkr,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025,kernel_version=5.14.0-645.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=d81f60a3-cfd4-40b3-a809-ad3aae1b1fd0,zone_name=default,zonegroup_id=75773215-ab74-4afd-a4c0-f777a01e4a1a,zonegroup_name=default}
Dec  6 04:42:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Dec  6 04:42:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Dec  6 04:42:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e8 new map
Dec  6 04:42:00 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy Updating MDS map to version 8 from mon.1
Dec  6 04:42:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e8 print_map#012e8#012btime 2025-12-06T09:42:00:908587+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-06T09:41:29.967778+0000#012modified#0112025-12-06T09:42:00.880325+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24274}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24274 members: 24274#012[mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.fpvjgb{-1:24215} state up:standby seq 1 addr [v2:192.168.122.101:6804/2619956440,v1:192.168.122.101:6805/2619956440] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.djsnbu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.djsnbu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.djsnbu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  6 04:42:01 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.djsnbu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  6 04:42:01 np0005548918 ceph-mds[84319]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Dec  6 04:42:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mds-cephfs-compute-2-czucwy[84315]: 2025-12-06T09:42:01.854+0000 7f5d9c766640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Dec  6 04:42:02 np0005548918 ceph-mon[75798]: Creating key for client.nfs.cephfs.0.0.compute-1.djsnbu
Dec  6 04:42:02 np0005548918 ceph-mon[75798]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Dec  6 04:42:02 np0005548918 ceph-mon[75798]: Rados config object exists: conf-nfs.cephfs
Dec  6 04:42:02 np0005548918 ceph-mon[75798]: Creating key for client.nfs.cephfs.0.0.compute-1.djsnbu-rgw
Dec  6 04:42:02 np0005548918 ceph-mon[75798]: Bind address in nfs.cephfs.0.0.compute-1.djsnbu's ganesha conf is defaulting to empty
Dec  6 04:42:02 np0005548918 ceph-mon[75798]: Deploying daemon nfs.cephfs.0.0.compute-1.djsnbu on compute-1
Dec  6 04:42:02 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e9 new map
Dec  6 04:42:02 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e9 print_map#012e9#012btime 2025-12-06T09:42:02:933823+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-06T09:41:29.967778+0000#012modified#0112025-12-06T09:42:00.880325+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24274}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24274 members: 24274#012[mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.fpvjgb{-1:24215} state up:standby seq 1 addr [v2:192.168.122.101:6804/2619956440,v1:192.168.122.101:6805/2619956440] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:42:03 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:03 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:03 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:03 np0005548918 ceph-mon[75798]: Creating key for client.nfs.cephfs.1.0.compute-2.sseuqb
Dec  6 04:42:03 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.sseuqb", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec  6 04:42:03 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.sseuqb", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec  6 04:42:03 np0005548918 ceph-mon[75798]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Dec  6 04:42:03 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec  6 04:42:03 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec  6 04:42:05 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e10 new map
Dec  6 04:42:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).mds e10 print_map#012e10#012btime 2025-12-06T09:42:05:044345+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-06T09:41:29.967778+0000#012modified#0112025-12-06T09:42:00.880325+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24274}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24274 members: 24274#012[mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.fpvjgb{-1:24215} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/2619956440,v1:192.168.122.101:6805/2619956440] compat {c=[1],r=[1],i=[1fff]}]
Dec  6 04:42:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:07 np0005548918 podman[84479]: 2025-12-06 09:42:07.007524854 +0000 UTC m=+0.053245676 container create 696325b5508a896b9992849c36dc43af970380a90a87c8fcbec860fbd205a95a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=recursing_kapitsa, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:42:07 np0005548918 systemd[1]: Started libpod-conmon-696325b5508a896b9992849c36dc43af970380a90a87c8fcbec860fbd205a95a.scope.
Dec  6 04:42:07 np0005548918 podman[84479]: 2025-12-06 09:42:06.978426635 +0000 UTC m=+0.024147537 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:42:07 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:42:07 np0005548918 podman[84479]: 2025-12-06 09:42:07.100777319 +0000 UTC m=+0.146498161 container init 696325b5508a896b9992849c36dc43af970380a90a87c8fcbec860fbd205a95a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=recursing_kapitsa, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:42:07 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec  6 04:42:07 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec  6 04:42:07 np0005548918 ceph-mon[75798]: Rados config object exists: conf-nfs.cephfs
Dec  6 04:42:07 np0005548918 ceph-mon[75798]: Creating key for client.nfs.cephfs.1.0.compute-2.sseuqb-rgw
Dec  6 04:42:07 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.sseuqb-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  6 04:42:07 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.sseuqb-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  6 04:42:07 np0005548918 ceph-mon[75798]: Bind address in nfs.cephfs.1.0.compute-2.sseuqb's ganesha conf is defaulting to empty
Dec  6 04:42:07 np0005548918 ceph-mon[75798]: Deploying daemon nfs.cephfs.1.0.compute-2.sseuqb on compute-2
Dec  6 04:42:07 np0005548918 podman[84479]: 2025-12-06 09:42:07.114801564 +0000 UTC m=+0.160522406 container start 696325b5508a896b9992849c36dc43af970380a90a87c8fcbec860fbd205a95a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=recursing_kapitsa, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:42:07 np0005548918 podman[84479]: 2025-12-06 09:42:07.119715245 +0000 UTC m=+0.165436157 container attach 696325b5508a896b9992849c36dc43af970380a90a87c8fcbec860fbd205a95a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=recursing_kapitsa, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 04:42:07 np0005548918 recursing_kapitsa[84495]: 167 167
Dec  6 04:42:07 np0005548918 systemd[1]: libpod-696325b5508a896b9992849c36dc43af970380a90a87c8fcbec860fbd205a95a.scope: Deactivated successfully.
Dec  6 04:42:07 np0005548918 conmon[84495]: conmon 696325b5508a896b9992 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-696325b5508a896b9992849c36dc43af970380a90a87c8fcbec860fbd205a95a.scope/container/memory.events
Dec  6 04:42:07 np0005548918 podman[84479]: 2025-12-06 09:42:07.123899848 +0000 UTC m=+0.169620660 container died 696325b5508a896b9992849c36dc43af970380a90a87c8fcbec860fbd205a95a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=recursing_kapitsa, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:42:07 np0005548918 systemd[1]: var-lib-containers-storage-overlay-2cf40fe829933b399053a174730f3216829bba1dc7a9166b9e2bcc4e1eb68884-merged.mount: Deactivated successfully.
Dec  6 04:42:07 np0005548918 podman[84479]: 2025-12-06 09:42:07.159907412 +0000 UTC m=+0.205628224 container remove 696325b5508a896b9992849c36dc43af970380a90a87c8fcbec860fbd205a95a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=recursing_kapitsa, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1)
Dec  6 04:42:07 np0005548918 systemd[1]: libpod-conmon-696325b5508a896b9992849c36dc43af970380a90a87c8fcbec860fbd205a95a.scope: Deactivated successfully.
Dec  6 04:42:07 np0005548918 systemd[1]: Reloading.
Dec  6 04:42:07 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:42:07 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:42:07 np0005548918 systemd[1]: Reloading.
Dec  6 04:42:07 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:42:07 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:42:07 np0005548918 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:42:08 np0005548918 podman[84635]: 2025-12-06 09:42:08.036545808 +0000 UTC m=+0.047447221 container create a3634fe4060dc94c2c20aff61ae4ab07f3ae7c7af9e41801a8e759fad2a4f938 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:42:08 np0005548918 podman[84635]: 2025-12-06 09:42:08.014438256 +0000 UTC m=+0.025339669 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:42:08 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e0607e97819625324c74760b6caea3cab529c87137413bc124ce71f1d8ada59/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:08 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e0607e97819625324c74760b6caea3cab529c87137413bc124ce71f1d8ada59/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:08 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e0607e97819625324c74760b6caea3cab529c87137413bc124ce71f1d8ada59/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:08 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e0607e97819625324c74760b6caea3cab529c87137413bc124ce71f1d8ada59/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.sseuqb-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:08 np0005548918 podman[84635]: 2025-12-06 09:42:08.132976777 +0000 UTC m=+0.143878190 container init a3634fe4060dc94c2c20aff61ae4ab07f3ae7c7af9e41801a8e759fad2a4f938 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec  6 04:42:08 np0005548918 podman[84635]: 2025-12-06 09:42:08.1487466 +0000 UTC m=+0.159647983 container start a3634fe4060dc94c2c20aff61ae4ab07f3ae7c7af9e41801a8e759fad2a4f938 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True)
Dec  6 04:42:08 np0005548918 bash[84635]: a3634fe4060dc94c2c20aff61ae4ab07f3ae7c7af9e41801a8e759fad2a4f938
Dec  6 04:42:08 np0005548918 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000002:nfs.cephfs.1: -2
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 04:42:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:42:08 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:08 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:08 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:08 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.dfwxck", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec  6 04:42:08 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.dfwxck", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec  6 04:42:08 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec  6 04:42:08 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec  6 04:42:09 np0005548918 ceph-mon[75798]: Creating key for client.nfs.cephfs.2.0.compute-0.dfwxck
Dec  6 04:42:09 np0005548918 ceph-mon[75798]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Dec  6 04:42:09 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec  6 04:42:09 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec  6 04:42:09 np0005548918 ceph-mon[75798]: Rados config object exists: conf-nfs.cephfs
Dec  6 04:42:09 np0005548918 ceph-mon[75798]: Creating key for client.nfs.cephfs.2.0.compute-0.dfwxck-rgw
Dec  6 04:42:09 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.dfwxck-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  6 04:42:09 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.dfwxck-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  6 04:42:09 np0005548918 ceph-mon[75798]: Bind address in nfs.cephfs.2.0.compute-0.dfwxck's ganesha conf is defaulting to empty
Dec  6 04:42:09 np0005548918 ceph-mon[75798]: Deploying daemon nfs.cephfs.2.0.compute-0.dfwxck on compute-0
Dec  6 04:42:09 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:10 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:42:10 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:10 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:10 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:42:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:10 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:42:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:10 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:42:11 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:11 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:11 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:11 np0005548918 ceph-mon[75798]: Deploying daemon haproxy.nfs.cephfs.compute-1.jmdafd on compute-1
Dec  6 04:42:14 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:17 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:18 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:18 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:18 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:18 np0005548918 ceph-mon[75798]: Deploying daemon haproxy.nfs.cephfs.compute-0.fzuvue on compute-0
Dec  6 04:42:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:19 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0014d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:21 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:22 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:22 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:23 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:23 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:23 np0005548918 ceph-mon[75798]: Deploying daemon haproxy.nfs.cephfs.compute-2.voodna on compute-2
Dec  6 04:42:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:23 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:25 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:25 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb80016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:26 np0005548918 podman[84804]: 2025-12-06 09:42:26.725885972 +0000 UTC m=+3.292362333 container create 609360bc8fc70091035c6ef6b4b28639ce992f8337d274c6ef81e98ac7611597 (image=quay.io/ceph/haproxy:2.3, name=loving_margulis)
Dec  6 04:42:26 np0005548918 systemd[1]: Started libpod-conmon-609360bc8fc70091035c6ef6b4b28639ce992f8337d274c6ef81e98ac7611597.scope.
Dec  6 04:42:26 np0005548918 podman[84804]: 2025-12-06 09:42:26.708956089 +0000 UTC m=+3.275432480 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec  6 04:42:26 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:42:26 np0005548918 podman[84804]: 2025-12-06 09:42:26.831106518 +0000 UTC m=+3.397582949 container init 609360bc8fc70091035c6ef6b4b28639ce992f8337d274c6ef81e98ac7611597 (image=quay.io/ceph/haproxy:2.3, name=loving_margulis)
Dec  6 04:42:26 np0005548918 podman[84804]: 2025-12-06 09:42:26.839002258 +0000 UTC m=+3.405478629 container start 609360bc8fc70091035c6ef6b4b28639ce992f8337d274c6ef81e98ac7611597 (image=quay.io/ceph/haproxy:2.3, name=loving_margulis)
Dec  6 04:42:26 np0005548918 podman[84804]: 2025-12-06 09:42:26.843868459 +0000 UTC m=+3.410344900 container attach 609360bc8fc70091035c6ef6b4b28639ce992f8337d274c6ef81e98ac7611597 (image=quay.io/ceph/haproxy:2.3, name=loving_margulis)
Dec  6 04:42:26 np0005548918 systemd[1]: libpod-609360bc8fc70091035c6ef6b4b28639ce992f8337d274c6ef81e98ac7611597.scope: Deactivated successfully.
Dec  6 04:42:26 np0005548918 loving_margulis[84917]: 0 0
Dec  6 04:42:26 np0005548918 conmon[84917]: conmon 609360bc8fc70091035c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-609360bc8fc70091035c6ef6b4b28639ce992f8337d274c6ef81e98ac7611597.scope/container/memory.events
Dec  6 04:42:26 np0005548918 podman[84804]: 2025-12-06 09:42:26.846441118 +0000 UTC m=+3.412917479 container died 609360bc8fc70091035c6ef6b4b28639ce992f8337d274c6ef81e98ac7611597 (image=quay.io/ceph/haproxy:2.3, name=loving_margulis)
Dec  6 04:42:26 np0005548918 systemd[1]: var-lib-containers-storage-overlay-47e651877768a2b5dcfc2cb6046d9469fb3f12120166af3801679e5e7d1420a9-merged.mount: Deactivated successfully.
Dec  6 04:42:26 np0005548918 podman[84804]: 2025-12-06 09:42:26.900633438 +0000 UTC m=+3.467109789 container remove 609360bc8fc70091035c6ef6b4b28639ce992f8337d274c6ef81e98ac7611597 (image=quay.io/ceph/haproxy:2.3, name=loving_margulis)
Dec  6 04:42:26 np0005548918 systemd[1]: libpod-conmon-609360bc8fc70091035c6ef6b4b28639ce992f8337d274c6ef81e98ac7611597.scope: Deactivated successfully.
Dec  6 04:42:26 np0005548918 systemd[1]: Reloading.
Dec  6 04:42:27 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:42:27 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:42:27 np0005548918 systemd[1]: Reloading.
Dec  6 04:42:27 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:42:27 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:42:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:27 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:27 np0005548918 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-2.voodna for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:42:27 np0005548918 podman[85062]: 2025-12-06 09:42:27.731424527 +0000 UTC m=+0.056963555 container create 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna)
Dec  6 04:42:27 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00b670fcb97840ca17918ff3351369082afeb2c6a25686dce6667b1fc31f4d13/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:27 np0005548918 podman[85062]: 2025-12-06 09:42:27.702783461 +0000 UTC m=+0.028322559 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec  6 04:42:27 np0005548918 podman[85062]: 2025-12-06 09:42:27.805837178 +0000 UTC m=+0.131376196 container init 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna)
Dec  6 04:42:27 np0005548918 podman[85062]: 2025-12-06 09:42:27.810480542 +0000 UTC m=+0.136019540 container start 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna)
Dec  6 04:42:27 np0005548918 bash[85062]: 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98
Dec  6 04:42:27 np0005548918 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-2.voodna for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:42:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [NOTICE] 339/094227 (2) : New worker #1 (4) forked
Dec  6 04:42:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:27 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd80023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:28 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:28 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:28 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:28 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:28 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:29 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb80016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Dec  6 04:42:29 np0005548918 ceph-mon[75798]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec  6 04:42:29 np0005548918 ceph-mon[75798]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  6 04:42:29 np0005548918 ceph-mon[75798]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  6 04:42:29 np0005548918 ceph-mon[75798]: Deploying daemon keepalived.nfs.cephfs.compute-1.uzbtlt on compute-1
Dec  6 04:42:29 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Dec  6 04:42:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:29 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Dec  6 04:42:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:30 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd80023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:30 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Dec  6 04:42:30 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Dec  6 04:42:30 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:42:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:31 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:31 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Dec  6 04:42:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:31 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb80016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:32 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:32 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Dec  6 04:42:32 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Dec  6 04:42:32 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:32 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Dec  6 04:42:32 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:42:32 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:42:32 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Dec  6 04:42:32 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Dec  6 04:42:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:33 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd80023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:33 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:42:33 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Dec  6 04:42:33 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:33 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Dec  6 04:42:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:33 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:34 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:34 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:42:34 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:42:34 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:42:34 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Dec  6 04:42:34 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:34 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:34 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Dec  6 04:42:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:35 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[7.16( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[7.1d( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[7.11( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[7.14( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[7.1f( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[7.5( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[7.a( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: Deploying daemon keepalived.nfs.cephfs.compute-0.ylrrzf on compute-0
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:35 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]: dispatch
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.3( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.15( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[6.1( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.f( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[9.9( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[6.7( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.9( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[9.8( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.a( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[9.b( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[6.3( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.d( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.c( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[6.d( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[6.5( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.b( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[6.f( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[6.9( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[9.18( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.1c( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[9.1d( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.6( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[9.13( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[9.7( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.2( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.5( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[9.3( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[9.5( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.11( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[9.17( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.16( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[8.1f( empty local-lis/les=0/0 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[9.16( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 58 pg[6.b( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:35 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd80034e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:36 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:36 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Dec  6 04:42:36 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:42:36 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:42:36 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:42:36 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:42:36 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:42:36 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]': finished
Dec  6 04:42:36 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Dec  6 04:42:36 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.15( v 51'44 (0'0,51'44] local-lis/les=58/59 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[6.1( v 50'39 (0'0,50'39] local-lis/les=58/59 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[6.d( v 50'39 lc 48'13 (0'0,50'39] local-lis/les=58/59 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.f( v 51'44 (0'0,51'44] local-lis/les=58/59 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[9.1d( v 44'12 (0'0,44'12] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.1c( v 51'44 (0'0,51'44] local-lis/les=58/59 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[9.13( v 44'12 (0'0,44'12] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.c( v 51'44 (0'0,51'44] local-lis/les=58/59 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[9.9( v 44'12 (0'0,44'12] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.3( v 51'44 (0'0,51'44] local-lis/les=58/59 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[7.1d( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.9( v 51'44 (0'0,51'44] local-lis/les=58/59 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[6.7( v 50'39 lc 48'20 (0'0,50'39] local-lis/les=58/59 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[7.1f( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.a( v 51'44 (0'0,51'44] local-lis/les=58/59 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[9.b( v 44'12 (0'0,44'12] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.11( v 51'44 (0'0,51'44] local-lis/les=58/59 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.d( v 51'44 lc 51'19 (0'0,51'44] local-lis/les=58/59 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[7.5( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.b( v 51'44 (0'0,51'44] local-lis/les=58/59 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[6.3( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=58/59 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=50'39 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[7.11( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.5( v 51'44 (0'0,51'44] local-lis/les=58/59 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.1f( v 51'44 (0'0,51'44] local-lis/les=58/59 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[6.b( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=58/59 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=50'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[6.5( v 50'39 lc 48'11 (0'0,50'39] local-lis/les=58/59 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[7.a( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[9.7( v 44'12 (0'0,44'12] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.6( v 51'44 (0'0,51'44] local-lis/les=58/59 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[6.9( v 50'39 (0'0,50'39] local-lis/les=58/59 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[9.18( v 44'12 lc 44'1 (0'0,44'12] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[7.16( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[9.8( v 44'12 (0'0,44'12] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[9.5( v 44'12 (0'0,44'12] local-lis/les=58/59 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[6.f( v 50'39 lc 48'1 (0'0,50'39] local-lis/les=58/59 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[9.3( v 44'12 (0'0,44'12] local-lis/les=58/59 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.16( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=58/59 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[9.17( v 44'12 (0'0,44'12] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[9.16( v 44'12 (0'0,44'12] local-lis/les=58/59 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[7.14( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 59 pg[8.2( v 51'44 (0'0,51'44] local-lis/les=58/59 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:37 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:37 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:37 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Dec  6 04:42:38 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:38 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:39 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Dec  6 04:42:39 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Dec  6 04:42:39 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 6.f scrub starts
Dec  6 04:42:39 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 6.f scrub ok
Dec  6 04:42:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:39 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:39 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:40 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:40 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:40 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:40 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:40 np0005548918 ceph-mon[75798]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  6 04:42:40 np0005548918 ceph-mon[75798]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  6 04:42:40 np0005548918 ceph-mon[75798]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec  6 04:42:40 np0005548918 ceph-mon[75798]: Deploying daemon keepalived.nfs.cephfs.compute-2.whsrlg on compute-2
Dec  6 04:42:40 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.d scrub starts
Dec  6 04:42:40 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.d scrub ok
Dec  6 04:42:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:40 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:41 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[12.11( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.17( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[12.13( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.15( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.13( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[12.4( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.f( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[12.9( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.b( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.3( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[12.3( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.5( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[12.2( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.19( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[12.1a( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.1d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[12.18( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.1f( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.9( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[12.7( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.7( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.1( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[12.1e( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[12.1d( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[10.11( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[12.17( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[11.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[11.17( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[11.a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[11.13( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[11.e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[11.8( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[11.3( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 62 pg[11.19( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:41 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]: dispatch
Dec  6 04:42:41 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:41 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Dec  6 04:42:41 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Dec  6 04:42:41 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Dec  6 04:42:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:41 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:41 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.16 scrub starts
Dec  6 04:42:42 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.1f( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.1f( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.11( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.17( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.11( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.17( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.13( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.13( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.1( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.1( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.f( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.9( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.f( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.9( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.7( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.1d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.7( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.1d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.5( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.5( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.b( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.b( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.3( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.3( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.15( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.15( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.19( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[10.19( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=63) [2]/[1] r=-1 lpr=63 pi=[58,63)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[11.17( v 48'48 (0'0,48'48] local-lis/les=62/63 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[11.13( v 48'48 (0'0,48'48] local-lis/les=62/63 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.16 scrub ok
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[11.a( v 48'48 (0'0,48'48] local-lis/les=62/63 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[12.18( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[11.16( v 48'48 (0'0,48'48] local-lis/les=62/63 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[12.17( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[12.7( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[12.11( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[12.9( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[11.e( v 61'51 lc 48'26 (0'0,61'51] local-lis/les=62/63 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=61'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[11.8( v 48'48 (0'0,48'48] local-lis/les=62/63 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[12.1a( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[12.2( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[11.19( v 48'48 (0'0,48'48] local-lis/les=62/63 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[12.3( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[12.4( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[12.1d( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[11.3( v 61'51 lc 48'38 (0'0,61'51] local-lis/les=62/63 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=61'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[12.13( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 63 pg[12.1e( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:42 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]': finished
Dec  6 04:42:42 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:42:42 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Dec  6 04:42:42 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:42:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:42 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:42 np0005548918 podman[85184]: 2025-12-06 09:42:42.68929777 +0000 UTC m=+2.765082476 container create 568b4d033ef2ee2b134969de87e1b2eb24016961b4b69de014882c1a5937f397 (image=quay.io/ceph/keepalived:2.2.4, name=objective_satoshi, io.openshift.expose-services=, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, release=1793, distribution-scope=public, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived)
Dec  6 04:42:42 np0005548918 systemd[1]: Started libpod-conmon-568b4d033ef2ee2b134969de87e1b2eb24016961b4b69de014882c1a5937f397.scope.
Dec  6 04:42:42 np0005548918 podman[85184]: 2025-12-06 09:42:42.672922762 +0000 UTC m=+2.748707508 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec  6 04:42:42 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:42:42 np0005548918 podman[85184]: 2025-12-06 09:42:42.781645741 +0000 UTC m=+2.857430447 container init 568b4d033ef2ee2b134969de87e1b2eb24016961b4b69de014882c1a5937f397 (image=quay.io/ceph/keepalived:2.2.4, name=objective_satoshi, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, io.buildah.version=1.28.2, io.openshift.expose-services=, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vcs-type=git, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec  6 04:42:42 np0005548918 podman[85184]: 2025-12-06 09:42:42.78870285 +0000 UTC m=+2.864487556 container start 568b4d033ef2ee2b134969de87e1b2eb24016961b4b69de014882c1a5937f397 (image=quay.io/ceph/keepalived:2.2.4, name=objective_satoshi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, com.redhat.component=keepalived-container, release=1793, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, io.openshift.expose-services=, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Dec  6 04:42:42 np0005548918 podman[85184]: 2025-12-06 09:42:42.791936196 +0000 UTC m=+2.867720902 container attach 568b4d033ef2ee2b134969de87e1b2eb24016961b4b69de014882c1a5937f397 (image=quay.io/ceph/keepalived:2.2.4, name=objective_satoshi, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, name=keepalived, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, vcs-type=git, version=2.2.4, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph.)
Dec  6 04:42:42 np0005548918 objective_satoshi[85279]: 0 0
Dec  6 04:42:42 np0005548918 systemd[1]: libpod-568b4d033ef2ee2b134969de87e1b2eb24016961b4b69de014882c1a5937f397.scope: Deactivated successfully.
Dec  6 04:42:42 np0005548918 podman[85184]: 2025-12-06 09:42:42.794793273 +0000 UTC m=+2.870577979 container died 568b4d033ef2ee2b134969de87e1b2eb24016961b4b69de014882c1a5937f397 (image=quay.io/ceph/keepalived:2.2.4, name=objective_satoshi, release=1793, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, version=2.2.4)
Dec  6 04:42:42 np0005548918 systemd[1]: var-lib-containers-storage-overlay-93a16f5d9c462603081035cb744ebb7010f8d07cca57095080be4f8f2591c1aa-merged.mount: Deactivated successfully.
Dec  6 04:42:42 np0005548918 podman[85184]: 2025-12-06 09:42:42.837358912 +0000 UTC m=+2.913143608 container remove 568b4d033ef2ee2b134969de87e1b2eb24016961b4b69de014882c1a5937f397 (image=quay.io/ceph/keepalived:2.2.4, name=objective_satoshi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.openshift.expose-services=, description=keepalived for Ceph, name=keepalived, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., version=2.2.4, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20)
Dec  6 04:42:42 np0005548918 systemd[1]: libpod-conmon-568b4d033ef2ee2b134969de87e1b2eb24016961b4b69de014882c1a5937f397.scope: Deactivated successfully.
Dec  6 04:42:42 np0005548918 systemd[1]: Reloading.
Dec  6 04:42:42 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:42:42 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:42:43 np0005548918 systemd[1]: Reloading.
Dec  6 04:42:43 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Dec  6 04:42:43 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 64 pg[6.3( v 50'39 (0'0,50'39] local-lis/les=58/59 n=2 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=64 pruub=9.803589821s) [1] r=-1 lpr=64 pi=[58,64)/1 crt=50'39 mlcod 50'39 active pruub 110.257621765s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:43 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 64 pg[6.3( v 50'39 (0'0,50'39] local-lis/les=58/59 n=2 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=64 pruub=9.803525925s) [1] r=-1 lpr=64 pi=[58,64)/1 crt=50'39 mlcod 0'0 unknown NOTIFY pruub 110.257621765s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:43 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 64 pg[6.b( v 50'39 (0'0,50'39] local-lis/les=58/59 n=1 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=64 pruub=9.803431511s) [1] r=-1 lpr=64 pi=[58,64)/1 crt=50'39 mlcod 50'39 active pruub 110.257667542s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:43 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 64 pg[6.b( v 50'39 (0'0,50'39] local-lis/les=58/59 n=1 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=64 pruub=9.803386688s) [1] r=-1 lpr=64 pi=[58,64)/1 crt=50'39 mlcod 0'0 unknown NOTIFY pruub 110.257667542s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:43 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 64 pg[6.7( v 50'39 (0'0,50'39] local-lis/les=58/59 n=1 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=64 pruub=9.802788734s) [1] r=-1 lpr=64 pi=[58,64)/1 crt=50'39 mlcod 50'39 active pruub 110.257369995s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:43 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 64 pg[6.7( v 50'39 (0'0,50'39] local-lis/les=58/59 n=1 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=64 pruub=9.802760124s) [1] r=-1 lpr=64 pi=[58,64)/1 crt=50'39 mlcod 0'0 unknown NOTIFY pruub 110.257369995s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:43 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 64 pg[6.f( v 50'39 (0'0,50'39] local-lis/les=58/59 n=3 ec=54/21 lis/c=58/58 les/c/f=59/60/0 sis=64 pruub=9.803228378s) [1] r=-1 lpr=64 pi=[58,64)/1 crt=50'39 mlcod 50'39 active pruub 110.257911682s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:43 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 64 pg[6.f( v 50'39 (0'0,50'39] local-lis/les=58/59 n=3 ec=54/21 lis/c=58/58 les/c/f=59/60/0 sis=64 pruub=9.803194046s) [1] r=-1 lpr=64 pi=[58,64)/1 crt=50'39 mlcod 0'0 unknown NOTIFY pruub 110.257911682s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:43 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:42:43 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:42:43 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]: dispatch
Dec  6 04:42:43 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Dec  6 04:42:43 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]': finished
Dec  6 04:42:43 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Dec  6 04:42:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:43 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:43 np0005548918 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-2.whsrlg for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:42:43 np0005548918 podman[85424]: 2025-12-06 09:42:43.661042981 +0000 UTC m=+0.052606629 container create cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg, io.k8s.display-name=Keepalived on RHEL 9, release=1793, name=keepalived, io.buildah.version=1.28.2, version=2.2.4, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., vcs-type=git, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, architecture=x86_64, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph.)
Dec  6 04:42:43 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2572da8f9f1c518a0afaa634b6e3d84645b909807d179231135d67583284a1e9/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:42:43 np0005548918 podman[85424]: 2025-12-06 09:42:43.70957915 +0000 UTC m=+0.101142778 container init cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.28.2, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, release=1793, name=keepalived, build-date=2023-02-22T09:23:20, architecture=x86_64, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container)
Dec  6 04:42:43 np0005548918 podman[85424]: 2025-12-06 09:42:43.713667449 +0000 UTC m=+0.105231067 container start cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, release=1793, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  6 04:42:43 np0005548918 bash[85424]: cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220
Dec  6 04:42:43 np0005548918 podman[85424]: 2025-12-06 09:42:43.632908068 +0000 UTC m=+0.024471766 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec  6 04:42:43 np0005548918 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-2.whsrlg for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:42:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:42:43 2025: Starting Keepalived v2.2.4 (08/21,2021)
Dec  6 04:42:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:42:43 2025: Running on Linux 5.14.0-645.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025 (built for Linux 5.14.0)
Dec  6 04:42:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:42:43 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Dec  6 04:42:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:42:43 2025: Configuration file /etc/keepalived/keepalived.conf
Dec  6 04:42:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:42:43 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Dec  6 04:42:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:42:43 2025: Starting VRRP child process, pid=4
Dec  6 04:42:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:42:43 2025: Startup complete
Dec  6 04:42:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:42:43 2025: (VI_0) Entering BACKUP STATE (init)
Dec  6 04:42:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:42:43 2025: VRRP_Script(check_backend) succeeded
Dec  6 04:42:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:43 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.5( v 64'1034 (0'0,64'1034] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=61'1030 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.5( v 64'1034 (0'0,64'1034] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=61'1030 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.b( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.b( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.f( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.f( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.17( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.17( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.11( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.11( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.1( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.1( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.9( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.9( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.7( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.7( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.3( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:44 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 65 pg[10.3( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:44 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:44 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:44 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:44 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:44 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:44 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.11( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.1( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.7( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.9( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.f( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.3( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.5( v 64'1034 (0'0,64'1034] local-lis/les=65/66 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=64'1034 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.b( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 66 pg[10.17( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=63/58 les/c/f=64/59/0 sis=65) [2] r=0 lpr=65 pi=[58,65)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:45 np0005548918 ceph-mon[75798]: Deploying daemon alertmanager.compute-0 on compute-0
Dec  6 04:42:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Dec  6 04:42:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:45 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:45 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:46 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.1f scrub starts
Dec  6 04:42:46 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.1f scrub ok
Dec  6 04:42:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Dec  6 04:42:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:46 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:47 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Dec  6 04:42:47 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Dec  6 04:42:47 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Dec  6 04:42:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:47 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:42:47 2025: (VI_0) Entering MASTER STATE
Dec  6 04:42:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:42:47 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Dec  6 04:42:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:42:47 2025: (VI_0) Entering BACKUP STATE
Dec  6 04:42:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:47 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:48 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.13 scrub starts
Dec  6 04:42:48 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.13 scrub ok
Dec  6 04:42:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:48 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:49 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Dec  6 04:42:49 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Dec  6 04:42:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:49 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:49 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:50 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.f scrub starts
Dec  6 04:42:50 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.f scrub ok
Dec  6 04:42:50 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548918 ceph-mon[75798]: Regenerating cephadm self-signed grafana TLS certificates
Dec  6 04:42:50 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Dec  6 04:42:50 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:50 np0005548918 ceph-mon[75798]: Deploying daemon grafana.compute-0 on compute-0
Dec  6 04:42:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:50 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc001d70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:51 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Dec  6 04:42:51 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Dec  6 04:42:51 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Dec  6 04:42:51 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:51 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]: dispatch
Dec  6 04:42:51 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Dec  6 04:42:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:51 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:51 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:52 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.13 deep-scrub starts
Dec  6 04:42:52 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.13 deep-scrub ok
Dec  6 04:42:52 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]': finished
Dec  6 04:42:52 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Dec  6 04:42:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:52 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:53 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.c scrub starts
Dec  6 04:42:53 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.c scrub ok
Dec  6 04:42:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:53 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc000df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Dec  6 04:42:53 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 71 pg[10.14( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:53 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 71 pg[10.c( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:53 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 71 pg[10.4( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:53 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 71 pg[10.1c( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:53 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 71 pg[6.5( v 50'39 (0'0,50'39] local-lis/les=58/59 n=2 ec=54/21 lis/c=58/58 les/c/f=59/60/0 sis=71 pruub=15.599052429s) [1] r=-1 lpr=71 pi=[58,71)/1 crt=50'39 mlcod 50'39 active pruub 126.257926941s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:53 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 71 pg[6.5( v 50'39 (0'0,50'39] local-lis/les=58/59 n=2 ec=54/21 lis/c=58/58 les/c/f=59/60/0 sis=71 pruub=15.598986626s) [1] r=-1 lpr=71 pi=[58,71)/1 crt=50'39 mlcod 0'0 unknown NOTIFY pruub 126.257926941s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:53 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 71 pg[6.d( v 50'39 (0'0,50'39] local-lis/les=58/59 n=1 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=71 pruub=15.598388672s) [1] r=-1 lpr=71 pi=[58,71)/1 crt=50'39 mlcod 50'39 active pruub 126.257263184s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:53 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 71 pg[6.d( v 50'39 (0'0,50'39] local-lis/les=58/59 n=1 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=71 pruub=15.598182678s) [1] r=-1 lpr=71 pi=[58,71)/1 crt=50'39 mlcod 0'0 unknown NOTIFY pruub 126.257263184s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:53 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]: dispatch
Dec  6 04:42:53 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Dec  6 04:42:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:53 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:54 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Dec  6 04:42:54 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Dec  6 04:42:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:54 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:55 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.9 deep-scrub starts
Dec  6 04:42:55 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.9 deep-scrub ok
Dec  6 04:42:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Dec  6 04:42:55 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 72 pg[10.c( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=72) [2]/[1] r=-1 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:55 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 72 pg[10.c( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=72) [2]/[1] r=-1 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:55 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 72 pg[10.1c( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=72) [2]/[1] r=-1 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:55 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 72 pg[10.1c( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=72) [2]/[1] r=-1 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:55 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 72 pg[10.4( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=72) [2]/[1] r=-1 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:55 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 72 pg[10.4( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=72) [2]/[1] r=-1 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:55 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 72 pg[10.14( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=72) [2]/[1] r=-1 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:55 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 72 pg[10.14( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=72) [2]/[1] r=-1 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:42:55 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]': finished
Dec  6 04:42:55 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Dec  6 04:42:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:55 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:42:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:55 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc000df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:56 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Dec  6 04:42:56 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Dec  6 04:42:56 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Dec  6 04:42:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:56 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:57 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Dec  6 04:42:57 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Dec  6 04:42:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Dec  6 04:42:57 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 74 pg[10.c( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=72/58 les/c/f=73/59/0 sis=74) [2] r=0 lpr=74 pi=[58,74)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:57 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 74 pg[10.c( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=72/58 les/c/f=73/59/0 sis=74) [2] r=0 lpr=74 pi=[58,74)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:57 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 74 pg[10.1c( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=72/58 les/c/f=73/59/0 sis=74) [2] r=0 lpr=74 pi=[58,74)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:57 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 74 pg[10.1c( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=72/58 les/c/f=73/59/0 sis=74) [2] r=0 lpr=74 pi=[58,74)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:57 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 74 pg[10.4( v 73'1035 (0'0,73'1035] local-lis/les=0/0 n=6 ec=58/45 lis/c=72/58 les/c/f=73/59/0 sis=74) [2] r=0 lpr=74 pi=[58,74)/1 luod=0'0 crt=66'1034 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:57 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 74 pg[10.4( v 73'1035 (0'0,73'1035] local-lis/les=0/0 n=6 ec=58/45 lis/c=72/58 les/c/f=73/59/0 sis=74) [2] r=0 lpr=74 pi=[58,74)/1 crt=66'1034 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:57 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 74 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=72/58 les/c/f=73/59/0 sis=74) [2] r=0 lpr=74 pi=[58,74)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:42:57 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 74 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=72/58 les/c/f=73/59/0 sis=74) [2] r=0 lpr=74 pi=[58,74)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:42:57 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:57 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:57 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:57 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:57 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:42:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:57 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:57 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:58 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Dec  6 04:42:58 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Dec  6 04:42:58 np0005548918 ceph-mon[75798]: Deploying daemon haproxy.rgw.default.compute-0.vhqyer on compute-0
Dec  6 04:42:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Dec  6 04:42:58 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 75 pg[10.c( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=6 ec=58/45 lis/c=72/58 les/c/f=73/59/0 sis=74) [2] r=0 lpr=74 pi=[58,74)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:58 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 75 pg[10.1c( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=5 ec=58/45 lis/c=72/58 les/c/f=73/59/0 sis=74) [2] r=0 lpr=74 pi=[58,74)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:58 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 75 pg[10.4( v 73'1035 (0'0,73'1035] local-lis/les=74/75 n=6 ec=58/45 lis/c=72/58 les/c/f=73/59/0 sis=74) [2] r=0 lpr=74 pi=[58,74)/1 crt=73'1035 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:58 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 75 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=5 ec=58/45 lis/c=72/58 les/c/f=73/59/0 sis=74) [2] r=0 lpr=74 pi=[58,74)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:42:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:58 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc000df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:59 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Dec  6 04:42:59 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Dec  6 04:42:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:59 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:59 np0005548918 podman[85548]: 2025-12-06 09:42:59.889071422 +0000 UTC m=+0.044957184 container create e8ce464e9bcea9c52adb2119f709478f63cf79ff6cdd23c7b31df4591dcf7636 (image=quay.io/ceph/haproxy:2.3, name=amazing_sanderson)
Dec  6 04:42:59 np0005548918 systemd[1]: Started libpod-conmon-e8ce464e9bcea9c52adb2119f709478f63cf79ff6cdd23c7b31df4591dcf7636.scope.
Dec  6 04:42:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:42:59 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:42:59 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:42:59 np0005548918 podman[85548]: 2025-12-06 09:42:59.867786613 +0000 UTC m=+0.023672395 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec  6 04:42:59 np0005548918 podman[85548]: 2025-12-06 09:42:59.971177209 +0000 UTC m=+0.127062981 container init e8ce464e9bcea9c52adb2119f709478f63cf79ff6cdd23c7b31df4591dcf7636 (image=quay.io/ceph/haproxy:2.3, name=amazing_sanderson)
Dec  6 04:42:59 np0005548918 podman[85548]: 2025-12-06 09:42:59.983334975 +0000 UTC m=+0.139220737 container start e8ce464e9bcea9c52adb2119f709478f63cf79ff6cdd23c7b31df4591dcf7636 (image=quay.io/ceph/haproxy:2.3, name=amazing_sanderson)
Dec  6 04:42:59 np0005548918 podman[85548]: 2025-12-06 09:42:59.986579712 +0000 UTC m=+0.142465494 container attach e8ce464e9bcea9c52adb2119f709478f63cf79ff6cdd23c7b31df4591dcf7636 (image=quay.io/ceph/haproxy:2.3, name=amazing_sanderson)
Dec  6 04:42:59 np0005548918 amazing_sanderson[85564]: 0 0
Dec  6 04:42:59 np0005548918 systemd[1]: libpod-e8ce464e9bcea9c52adb2119f709478f63cf79ff6cdd23c7b31df4591dcf7636.scope: Deactivated successfully.
Dec  6 04:42:59 np0005548918 conmon[85564]: conmon e8ce464e9bcea9c52adb <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e8ce464e9bcea9c52adb2119f709478f63cf79ff6cdd23c7b31df4591dcf7636.scope/container/memory.events
Dec  6 04:42:59 np0005548918 podman[85548]: 2025-12-06 09:42:59.990493716 +0000 UTC m=+0.146379508 container died e8ce464e9bcea9c52adb2119f709478f63cf79ff6cdd23c7b31df4591dcf7636 (image=quay.io/ceph/haproxy:2.3, name=amazing_sanderson)
Dec  6 04:43:00 np0005548918 systemd[1]: var-lib-containers-storage-overlay-6d4998b158678933436578b7326c7646f22ad49f69d5af5b074472ccff5da8b6-merged.mount: Deactivated successfully.
Dec  6 04:43:00 np0005548918 podman[85548]: 2025-12-06 09:43:00.025903644 +0000 UTC m=+0.181789406 container remove e8ce464e9bcea9c52adb2119f709478f63cf79ff6cdd23c7b31df4591dcf7636 (image=quay.io/ceph/haproxy:2.3, name=amazing_sanderson)
Dec  6 04:43:00 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.b scrub starts
Dec  6 04:43:00 np0005548918 systemd[1]: libpod-conmon-e8ce464e9bcea9c52adb2119f709478f63cf79ff6cdd23c7b31df4591dcf7636.scope: Deactivated successfully.
Dec  6 04:43:00 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.b scrub ok
Dec  6 04:43:00 np0005548918 systemd[1]: Reloading.
Dec  6 04:43:00 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:43:00 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:43:00 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:00 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:00 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:00 np0005548918 ceph-mon[75798]: Deploying daemon haproxy.rgw.default.compute-2.mwbfro on compute-2
Dec  6 04:43:00 np0005548918 systemd[1]: Reloading.
Dec  6 04:43:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:00 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:00 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:43:00 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:43:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.002000053s ======
Dec  6 04:43:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:00.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Dec  6 04:43:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:00 np0005548918 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.mwbfro for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:43:01 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Dec  6 04:43:01 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Dec  6 04:43:01 np0005548918 podman[85713]: 2025-12-06 09:43:01.008212277 +0000 UTC m=+0.025948456 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec  6 04:43:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:01 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc000df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:01 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:02 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.a scrub starts
Dec  6 04:43:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:02 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:02.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:02 np0005548918 podman[85713]: 2025-12-06 09:43:02.966866563 +0000 UTC m=+1.984602772 container create b8d9570c47bd3592fbd103f6b0a3ee54d799a0a9f6736a755694d20150d3cb7c (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-rgw-default-compute-2-mwbfro)
Dec  6 04:43:02 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.a scrub ok
Dec  6 04:43:02 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:02 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]: dispatch
Dec  6 04:43:02 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Dec  6 04:43:03 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Dec  6 04:43:03 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 76 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76 pruub=14.227431297s) [0] r=-1 lpr=76 pi=[65,76)/1 crt=51'1027 mlcod 0'0 active pruub 134.490539551s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:03 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 76 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76 pruub=14.227377892s) [0] r=-1 lpr=76 pi=[65,76)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 134.490539551s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:03 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 76 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76 pruub=14.226625443s) [0] r=-1 lpr=76 pi=[65,76)/1 crt=51'1027 mlcod 0'0 active pruub 134.490554810s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:03 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 76 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76 pruub=14.226483345s) [0] r=-1 lpr=76 pi=[65,76)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 134.490554810s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:03 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 76 pg[10.5( v 66'1039 (0'0,66'1039] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76 pruub=14.226249695s) [0] r=-1 lpr=76 pi=[65,76)/1 crt=66'1036 lcod 66'1038 mlcod 66'1038 active pruub 134.490722656s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:03 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 76 pg[10.5( v 66'1039 (0'0,66'1039] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76 pruub=14.226179123s) [0] r=-1 lpr=76 pi=[65,76)/1 crt=66'1036 lcod 66'1038 mlcod 0'0 unknown NOTIFY pruub 134.490722656s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:03 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 76 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76 pruub=14.226050377s) [0] r=-1 lpr=76 pi=[65,76)/1 crt=51'1027 mlcod 0'0 active pruub 134.490829468s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:03 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 76 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76 pruub=14.226005554s) [0] r=-1 lpr=76 pi=[65,76)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 134.490829468s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:03 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8454f3fcc72ff5e8c41783c27f8c7e712d963a67a666869183727484c7bf0ff/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Dec  6 04:43:03 np0005548918 podman[85713]: 2025-12-06 09:43:03.038213712 +0000 UTC m=+2.055949891 container init b8d9570c47bd3592fbd103f6b0a3ee54d799a0a9f6736a755694d20150d3cb7c (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-rgw-default-compute-2-mwbfro)
Dec  6 04:43:03 np0005548918 podman[85713]: 2025-12-06 09:43:03.043888074 +0000 UTC m=+2.061624233 container start b8d9570c47bd3592fbd103f6b0a3ee54d799a0a9f6736a755694d20150d3cb7c (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-rgw-default-compute-2-mwbfro)
Dec  6 04:43:03 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 7.5 deep-scrub starts
Dec  6 04:43:03 np0005548918 bash[85713]: b8d9570c47bd3592fbd103f6b0a3ee54d799a0a9f6736a755694d20150d3cb7c
Dec  6 04:43:03 np0005548918 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.mwbfro for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:43:03 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 7.5 deep-scrub ok
Dec  6 04:43:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-rgw-default-compute-2-mwbfro[85729]: [NOTICE] 339/094303 (2) : New worker #1 (4) forked
Dec  6 04:43:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:03 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:03 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc0095a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:04 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Dec  6 04:43:04 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Dec  6 04:43:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:04.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:04 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:04.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:04 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.5 scrub starts
Dec  6 04:43:05 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.5 scrub ok
Dec  6 04:43:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:05 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e76 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:05 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]': finished
Dec  6 04:43:05 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Dec  6 04:43:05 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]: dispatch
Dec  6 04:43:05 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Dec  6 04:43:05 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:05 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:05 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:05 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:05 np0005548918 ceph-mon[75798]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  6 04:43:05 np0005548918 ceph-mon[75798]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  6 04:43:05 np0005548918 ceph-mon[75798]: Deploying daemon keepalived.rgw.default.compute-0.mycoxk on compute-0
Dec  6 04:43:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Dec  6 04:43:05 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 77 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=0 lpr=77 pi=[65,77)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:05 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 77 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=0 lpr=77 pi=[65,77)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:05 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 77 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=0 lpr=77 pi=[65,77)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:05 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 77 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=0 lpr=77 pi=[65,77)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:05 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 77 pg[10.5( v 66'1039 (0'0,66'1039] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=0 lpr=77 pi=[65,77)/1 crt=66'1036 lcod 66'1038 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:05 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 77 pg[10.5( v 66'1039 (0'0,66'1039] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=0 lpr=77 pi=[65,77)/1 crt=66'1036 lcod 66'1038 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:05 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 77 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=0 lpr=77 pi=[65,77)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:05 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:05 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 77 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=0 lpr=77 pi=[65,77)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:05 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.b scrub starts
Dec  6 04:43:05 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.b scrub ok
Dec  6 04:43:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:06.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:06 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:43:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:06.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:43:06 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]': finished
Dec  6 04:43:06 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Dec  6 04:43:06 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Dec  6 04:43:06 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Dec  6 04:43:06 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 78 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=77/78 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] async=[0] r=0 lpr=77 pi=[65,77)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:06 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 78 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=77/78 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] async=[0] r=0 lpr=77 pi=[65,77)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:06 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 78 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=77/78 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] async=[0] r=0 lpr=77 pi=[65,77)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:06 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 78 pg[10.5( v 66'1039 (0'0,66'1039] local-lis/les=77/78 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] async=[0] r=0 lpr=77 pi=[65,77)/1 crt=66'1039 lcod 66'1038 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:06 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Dec  6 04:43:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:07 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc0095a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:07 np0005548918 podman[85841]: 2025-12-06 09:43:07.901324481 +0000 UTC m=+0.062639286 container create 102a1d9d18503322d818acb448a1632dfd0ec5da29073c7f04a62a5351428979 (image=quay.io/ceph/keepalived:2.2.4, name=loving_bell, release=1793, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, architecture=x86_64, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, version=2.2.4, vcs-type=git, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  6 04:43:07 np0005548918 systemd[1]: Started libpod-conmon-102a1d9d18503322d818acb448a1632dfd0ec5da29073c7f04a62a5351428979.scope.
Dec  6 04:43:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:07 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:07 np0005548918 podman[85841]: 2025-12-06 09:43:07.87400306 +0000 UTC m=+0.035317855 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec  6 04:43:07 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:07 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:07 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:07 np0005548918 ceph-mon[75798]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  6 04:43:07 np0005548918 ceph-mon[75798]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  6 04:43:07 np0005548918 ceph-mon[75798]: Deploying daemon keepalived.rgw.default.compute-2.yurwwh on compute-2
Dec  6 04:43:07 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:43:07 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Dec  6 04:43:08 np0005548918 podman[85841]: 2025-12-06 09:43:07.999841177 +0000 UTC m=+0.161155972 container init 102a1d9d18503322d818acb448a1632dfd0ec5da29073c7f04a62a5351428979 (image=quay.io/ceph/keepalived:2.2.4, name=loving_bell, build-date=2023-02-22T09:23:20, architecture=x86_64, io.openshift.tags=Ceph keepalived, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, release=1793, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, name=keepalived, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9)
Dec  6 04:43:08 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 79 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=77/78 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79 pruub=14.983809471s) [0] async=[0] r=-1 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 51'1027 active pruub 140.237838745s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:08 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 79 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=77/78 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79 pruub=14.983670235s) [0] r=-1 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 140.237838745s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:08 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 79 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=77/78 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79 pruub=14.982343674s) [0] async=[0] r=-1 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 51'1027 active pruub 140.237762451s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:08 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 79 pg[10.5( v 78'1042 (0'0,78'1042] local-lis/les=77/78 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79 pruub=14.982428551s) [0] async=[0] r=-1 lpr=79 pi=[65,79)/1 crt=66'1039 lcod 78'1041 mlcod 78'1041 active pruub 140.237915039s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:08 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 79 pg[10.5( v 78'1042 (0'0,78'1042] local-lis/les=77/78 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79 pruub=14.982355118s) [0] r=-1 lpr=79 pi=[65,79)/1 crt=66'1039 lcod 78'1041 mlcod 0'0 unknown NOTIFY pruub 140.237915039s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:08 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 79 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=77/78 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79 pruub=14.977651596s) [0] async=[0] r=-1 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 51'1027 active pruub 140.233718872s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:08 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 79 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=77/78 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79 pruub=14.977616310s) [0] r=-1 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 140.233718872s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:08 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 79 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=77/78 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79 pruub=14.981857300s) [0] r=-1 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 140.237762451s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:08 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec  6 04:43:08 np0005548918 podman[85841]: 2025-12-06 09:43:08.013037711 +0000 UTC m=+0.174352486 container start 102a1d9d18503322d818acb448a1632dfd0ec5da29073c7f04a62a5351428979 (image=quay.io/ceph/keepalived:2.2.4, name=loving_bell, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1793, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, com.redhat.component=keepalived-container, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64)
Dec  6 04:43:08 np0005548918 podman[85841]: 2025-12-06 09:43:08.016574755 +0000 UTC m=+0.177889550 container attach 102a1d9d18503322d818acb448a1632dfd0ec5da29073c7f04a62a5351428979 (image=quay.io/ceph/keepalived:2.2.4, name=loving_bell, description=keepalived for Ceph, vcs-type=git, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, name=keepalived, architecture=x86_64, com.redhat.component=keepalived-container)
Dec  6 04:43:08 np0005548918 loving_bell[85857]: 0 0
Dec  6 04:43:08 np0005548918 systemd[1]: libpod-102a1d9d18503322d818acb448a1632dfd0ec5da29073c7f04a62a5351428979.scope: Deactivated successfully.
Dec  6 04:43:08 np0005548918 podman[85841]: 2025-12-06 09:43:08.021859286 +0000 UTC m=+0.183174081 container died 102a1d9d18503322d818acb448a1632dfd0ec5da29073c7f04a62a5351428979 (image=quay.io/ceph/keepalived:2.2.4, name=loving_bell, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, distribution-scope=public, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, release=1793, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git)
Dec  6 04:43:08 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec  6 04:43:08 np0005548918 systemd[1]: var-lib-containers-storage-overlay-7851b9f7245c88bb8f1e90c14c6e7ab9ddf35047878075b09ee7a306e87109a0-merged.mount: Deactivated successfully.
Dec  6 04:43:08 np0005548918 podman[85841]: 2025-12-06 09:43:08.066882711 +0000 UTC m=+0.228197526 container remove 102a1d9d18503322d818acb448a1632dfd0ec5da29073c7f04a62a5351428979 (image=quay.io/ceph/keepalived:2.2.4, name=loving_bell, io.openshift.expose-services=, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, name=keepalived, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, version=2.2.4, description=keepalived for Ceph)
Dec  6 04:43:08 np0005548918 systemd[1]: libpod-conmon-102a1d9d18503322d818acb448a1632dfd0ec5da29073c7f04a62a5351428979.scope: Deactivated successfully.
Dec  6 04:43:08 np0005548918 systemd[1]: Reloading.
Dec  6 04:43:08 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:43:08 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:43:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:43:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:08.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:43:08 np0005548918 systemd[1]: Reloading.
Dec  6 04:43:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:08 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:43:08 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:43:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:08.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:08 np0005548918 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.yurwwh for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:43:08 np0005548918 podman[86004]: 2025-12-06 09:43:08.965828784 +0000 UTC m=+0.042649672 container create 721a580e9e61a717818b669d502fe357ca1ff0f7feded812d9a82a7175fceded (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., release=1793, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, name=keepalived, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, vcs-type=git, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  6 04:43:09 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Dec  6 04:43:09 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2facdc6c952ecc28b858fa3236d52e956f844b41d5faab71a7a2977de9f3b855/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:43:09 np0005548918 podman[86004]: 2025-12-06 09:43:09.028883202 +0000 UTC m=+0.105704110 container init 721a580e9e61a717818b669d502fe357ca1ff0f7feded812d9a82a7175fceded (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, release=1793, io.openshift.expose-services=, vcs-type=git, build-date=2023-02-22T09:23:20, name=keepalived, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, architecture=x86_64, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived)
Dec  6 04:43:09 np0005548918 podman[86004]: 2025-12-06 09:43:09.035157519 +0000 UTC m=+0.111978407 container start 721a580e9e61a717818b669d502fe357ca1ff0f7feded812d9a82a7175fceded (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, architecture=x86_64, vendor=Red Hat, Inc., release=1793, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20)
Dec  6 04:43:09 np0005548918 bash[86004]: 721a580e9e61a717818b669d502fe357ca1ff0f7feded812d9a82a7175fceded
Dec  6 04:43:09 np0005548918 podman[86004]: 2025-12-06 09:43:08.950215477 +0000 UTC m=+0.027036385 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec  6 04:43:09 np0005548918 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.yurwwh for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:43:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:09 2025: Starting Keepalived v2.2.4 (08/21,2021)
Dec  6 04:43:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:09 2025: Running on Linux 5.14.0-645.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025 (built for Linux 5.14.0)
Dec  6 04:43:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:09 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Dec  6 04:43:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:09 2025: Configuration file /etc/keepalived/keepalived.conf
Dec  6 04:43:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:09 2025: Failed to bind to process monitoring socket - errno 98 - Address already in use
Dec  6 04:43:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:09 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Dec  6 04:43:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:09 2025: Starting VRRP child process, pid=4
Dec  6 04:43:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:09 2025: Startup complete
Dec  6 04:43:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:09 2025: (VI_0) Entering BACKUP STATE (init)
Dec  6 04:43:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:09 2025: VRRP_Script(check_backend) succeeded
Dec  6 04:43:09 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Dec  6 04:43:09 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Dec  6 04:43:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:09 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003db0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:09 np0005548918 systemd-logind[800]: New session 36 of user zuul.
Dec  6 04:43:09 np0005548918 systemd[1]: Started Session 36 of User zuul.
Dec  6 04:43:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:09 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:10 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:10 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:10 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:10 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:10 np0005548918 ceph-mon[75798]: Deploying daemon prometheus.compute-0 on compute-0
Dec  6 04:43:10 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.8 deep-scrub starts
Dec  6 04:43:10 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.8 deep-scrub ok
Dec  6 04:43:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Dec  6 04:43:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:10.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:10 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:10.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:10 np0005548918 python3.9[86181]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:43:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:11 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.5 scrub starts
Dec  6 04:43:11 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.5 scrub ok
Dec  6 04:43:11 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:11 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]: dispatch
Dec  6 04:43:11 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Dec  6 04:43:11 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Dec  6 04:43:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 82 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=82 pruub=13.840667725s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1027 mlcod 0'0 active pruub 142.486602783s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 82 pg[10.17( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=82 pruub=13.844517708s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1027 mlcod 0'0 active pruub 142.490692139s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 82 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=82 pruub=13.840449333s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 142.486602783s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 82 pg[10.17( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=82 pruub=13.844472885s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 142.490692139s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 82 pg[10.f( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=82 pruub=13.843992233s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1027 mlcod 0'0 active pruub 142.490783691s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 82 pg[10.f( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=82 pruub=13.843950272s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 142.490783691s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 82 pg[10.7( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=82 pruub=13.843597412s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1027 mlcod 0'0 active pruub 142.490768433s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:11 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 82 pg[10.7( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=82 pruub=13.843526840s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 142.490768433s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:11 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:11 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003db0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:11 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Dec  6 04:43:12 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Dec  6 04:43:12 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]': finished
Dec  6 04:43:12 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Dec  6 04:43:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:12.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:12 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Dec  6 04:43:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 83 pg[10.17( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=83) [1]/[2] r=0 lpr=83 pi=[65,83)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 83 pg[10.f( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=83) [1]/[2] r=0 lpr=83 pi=[65,83)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 83 pg[10.17( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=83) [1]/[2] r=0 lpr=83 pi=[65,83)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 83 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=83) [1]/[2] r=0 lpr=83 pi=[65,83)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 83 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=83) [1]/[2] r=0 lpr=83 pi=[65,83)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 83 pg[10.f( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=83) [1]/[2] r=0 lpr=83 pi=[65,83)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 83 pg[10.7( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=83) [1]/[2] r=0 lpr=83 pi=[65,83)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:12 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 83 pg[10.7( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=83) [1]/[2] r=0 lpr=83 pi=[65,83)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:12 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:12 np0005548918 python3.9[86397]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:43:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:12.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:13 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Dec  6 04:43:13 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Dec  6 04:43:13 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]: dispatch
Dec  6 04:43:13 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Dec  6 04:43:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:13 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:13 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Dec  6 04:43:13 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 84 pg[6.9( v 50'39 (0'0,50'39] local-lis/les=58/59 n=0 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=84 pruub=11.578199387s) [0] r=-1 lpr=84 pi=[58,84)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 142.258438110s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:13 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 84 pg[6.9( v 50'39 (0'0,50'39] local-lis/les=58/59 n=0 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=84 pruub=11.578167915s) [0] r=-1 lpr=84 pi=[58,84)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 142.258438110s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:13 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:14 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Dec  6 04:43:14 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Dec  6 04:43:14 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 84 pg[10.7( v 51'1027 (0'0,51'1027] local-lis/les=83/84 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=83) [1]/[2] async=[1] r=0 lpr=83 pi=[65,83)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:14 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 84 pg[10.17( v 51'1027 (0'0,51'1027] local-lis/les=83/84 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=83) [1]/[2] async=[1] r=0 lpr=83 pi=[65,83)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:14 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 84 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=83/84 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=83) [1]/[2] async=[1] r=0 lpr=83 pi=[65,83)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:14 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 84 pg[10.f( v 51'1027 (0'0,51'1027] local-lis/les=83/84 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=83) [1]/[2] async=[1] r=0 lpr=83 pi=[65,83)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:14.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]': finished
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Dec  6 04:43:14 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 85 pg[10.17( v 51'1027 (0'0,51'1027] local-lis/les=83/84 n=5 ec=58/45 lis/c=83/65 les/c/f=84/66/0 sis=85 pruub=15.581374168s) [1] async=[1] r=-1 lpr=85 pi=[65,85)/1 crt=51'1027 mlcod 51'1027 active pruub 147.343536377s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:14 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 85 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=83/84 n=5 ec=58/45 lis/c=83/65 les/c/f=84/66/0 sis=85 pruub=15.581397057s) [1] async=[1] r=-1 lpr=85 pi=[65,85)/1 crt=51'1027 mlcod 51'1027 active pruub 147.343582153s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:14 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 85 pg[10.17( v 51'1027 (0'0,51'1027] local-lis/les=83/84 n=5 ec=58/45 lis/c=83/65 les/c/f=84/66/0 sis=85 pruub=15.581324577s) [1] r=-1 lpr=85 pi=[65,85)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 147.343536377s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:14 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 85 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=83/84 n=5 ec=58/45 lis/c=83/65 les/c/f=84/66/0 sis=85 pruub=15.581362724s) [1] r=-1 lpr=85 pi=[65,85)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 147.343582153s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:14.524163) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194524318, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7405, "num_deletes": 256, "total_data_size": 21180002, "memory_usage": 22477040, "flush_reason": "Manual Compaction"}
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Dec  6 04:43:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:14 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003db0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:14.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194688431, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 13003706, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 249, "largest_seqno": 7410, "table_properties": {"data_size": 12975238, "index_size": 18113, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9285, "raw_key_size": 90255, "raw_average_key_size": 24, "raw_value_size": 12903944, "raw_average_value_size": 3480, "num_data_blocks": 803, "num_entries": 3708, "num_filter_entries": 3708, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014010, "oldest_key_time": 1765014010, "file_creation_time": 1765014194, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 164293 microseconds, and 30597 cpu microseconds.
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:14.688488) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 13003706 bytes OK
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:14.688506) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:14.691345) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:14.691360) EVENT_LOG_v1 {"time_micros": 1765014194691356, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:14.691439) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 21139983, prev total WAL file size 21139983, number of live WAL files 2.
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:14.694624) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(12MB) 8(1648B)]
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194694736, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 13005354, "oldest_snapshot_seqno": -1}
Dec  6 04:43:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3455 keys, 12999849 bytes, temperature: kUnknown
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194891535, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 12999849, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12972098, "index_size": 18041, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8645, "raw_key_size": 86039, "raw_average_key_size": 24, "raw_value_size": 12904002, "raw_average_value_size": 3734, "num_data_blocks": 801, "num_entries": 3455, "num_filter_entries": 3455, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765014194, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:14.891787) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 12999849 bytes
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:14.895872) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 66.1 rd, 66.0 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(12.4, 0.0 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3713, records dropped: 258 output_compression: NoCompression
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:14.895918) EVENT_LOG_v1 {"time_micros": 1765014194895901, "job": 4, "event": "compaction_finished", "compaction_time_micros": 196878, "compaction_time_cpu_micros": 22390, "output_level": 6, "num_output_files": 1, "total_output_size": 12999849, "num_input_records": 3713, "num_output_records": 3455, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194899316, "job": 4, "event": "table_file_deletion", "file_number": 14}
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194899385, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec  6 04:43:14 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:14.694480) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:15 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Dec  6 04:43:15 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Dec  6 04:43:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:15 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Dec  6 04:43:15 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 86 pg[10.f( v 51'1027 (0'0,51'1027] local-lis/les=83/84 n=6 ec=58/45 lis/c=83/65 les/c/f=84/66/0 sis=86 pruub=14.569250107s) [1] async=[1] r=-1 lpr=86 pi=[65,86)/1 crt=51'1027 mlcod 51'1027 active pruub 147.343566895s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:15 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 86 pg[10.f( v 51'1027 (0'0,51'1027] local-lis/les=83/84 n=6 ec=58/45 lis/c=83/65 les/c/f=84/66/0 sis=86 pruub=14.569159508s) [1] r=-1 lpr=86 pi=[65,86)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 147.343566895s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:15 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 86 pg[10.9( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=86 pruub=9.716260910s) [1] r=-1 lpr=86 pi=[65,86)/1 crt=51'1027 mlcod 0'0 active pruub 142.490753174s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:15 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 86 pg[10.9( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=86 pruub=9.716148376s) [1] r=-1 lpr=86 pi=[65,86)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 142.490753174s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:15 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 86 pg[10.7( v 51'1027 (0'0,51'1027] local-lis/les=83/84 n=6 ec=58/45 lis/c=83/65 les/c/f=84/66/0 sis=86 pruub=14.568492889s) [1] async=[1] r=-1 lpr=86 pi=[65,86)/1 crt=51'1027 mlcod 51'1027 active pruub 147.343521118s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:15 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 86 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=86 pruub=9.715703964s) [1] r=-1 lpr=86 pi=[65,86)/1 crt=51'1027 mlcod 0'0 active pruub 142.490997314s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:15 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 86 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=86 pruub=9.715685844s) [1] r=-1 lpr=86 pi=[65,86)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 142.490997314s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:15 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 86 pg[10.7( v 51'1027 (0'0,51'1027] local-lis/les=83/84 n=6 ec=58/45 lis/c=83/65 les/c/f=84/66/0 sis=86 pruub=14.568037033s) [1] r=-1 lpr=86 pi=[65,86)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 147.343521118s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:15 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]: dispatch
Dec  6 04:43:15 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Dec  6 04:43:15 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:15 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:15 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:15 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Dec  6 04:43:15 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:15 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Dec  6 04:43:16 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 87 pg[10.9( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=87) [1]/[2] r=0 lpr=87 pi=[65,87)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:16 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 87 pg[10.9( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=87) [1]/[2] r=0 lpr=87 pi=[65,87)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:16 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 87 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=87) [1]/[2] r=0 lpr=87 pi=[65,87)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:16 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 87 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=87) [1]/[2] r=0 lpr=87 pi=[65,87)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.020479) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196020525, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 342, "num_deletes": 253, "total_data_size": 249897, "memory_usage": 257848, "flush_reason": "Manual Compaction"}
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196026427, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 165706, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7415, "largest_seqno": 7752, "table_properties": {"data_size": 163515, "index_size": 355, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 4664, "raw_average_key_size": 15, "raw_value_size": 159093, "raw_average_value_size": 526, "num_data_blocks": 16, "num_entries": 302, "num_filter_entries": 302, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014195, "oldest_key_time": 1765014195, "file_creation_time": 1765014196, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 5989 microseconds, and 1414 cpu microseconds.
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.026470) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 165706 bytes OK
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.026489) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.027747) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.027761) EVENT_LOG_v1 {"time_micros": 1765014196027757, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.027776) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 247490, prev total WAL file size 247490, number of live WAL files 2.
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.028547) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323534' seq:0, type:0; will stop at (end)
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(161KB)], [15(12MB)]
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196028584, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 13165555, "oldest_snapshot_seqno": -1}
Dec  6 04:43:16 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.7 scrub starts
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3235 keys, 12743143 bytes, temperature: kUnknown
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196158908, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 12743143, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12716720, "index_size": 17225, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8133, "raw_key_size": 83319, "raw_average_key_size": 25, "raw_value_size": 12652301, "raw_average_value_size": 3911, "num_data_blocks": 748, "num_entries": 3235, "num_filter_entries": 3235, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765014196, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.159143) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 12743143 bytes
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.163658) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 101.0 rd, 97.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 12.4 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(156.4) write-amplify(76.9) OK, records in: 3757, records dropped: 522 output_compression: NoCompression
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.163684) EVENT_LOG_v1 {"time_micros": 1765014196163673, "job": 6, "event": "compaction_finished", "compaction_time_micros": 130392, "compaction_time_cpu_micros": 29210, "output_level": 6, "num_output_files": 1, "total_output_size": 12743143, "num_input_records": 3757, "num_output_records": 3235, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196163824, "job": 6, "event": "table_file_deletion", "file_number": 17}
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196166166, "job": 6, "event": "table_file_deletion", "file_number": 15}
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.028429) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.166428) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.166498) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.166559) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.166619) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:43:16.166679) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:43:16 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 9.7 scrub ok
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn  1: '-n'
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn  2: 'mgr.compute-2.oazbvn'
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn  3: '-f'
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn  4: '--setuser'
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn  5: 'ceph'
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn  6: '--setgroup'
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn  7: 'ceph'
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn  8: '--default-log-to-file=false'
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn  9: '--default-log-to-journald=true'
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn  10: '--default-log-to-stderr=false'
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr respawn  exe_path /proc/self/exe
Dec  6 04:43:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: ignoring --setuser ceph since I am not root
Dec  6 04:43:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: ignoring --setgroup ceph since I am not root
Dec  6 04:43:16 np0005548918 systemd[1]: session-34.scope: Deactivated successfully.
Dec  6 04:43:16 np0005548918 systemd[1]: session-34.scope: Consumed 22.363s CPU time.
Dec  6 04:43:16 np0005548918 systemd-logind[800]: Session 34 logged out. Waiting for processes to exit.
Dec  6 04:43:16 np0005548918 systemd-logind[800]: Removed session 34.
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: pidfile_write: ignore empty --pid-file
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'alerts'
Dec  6 04:43:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:16.399+0000 7f75dcc0e140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'balancer'
Dec  6 04:43:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:16.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:16.480+0000 7f75dcc0e140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  6 04:43:16 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'cephadm'
Dec  6 04:43:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:16 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]': finished
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Dec  6 04:43:16 np0005548918 ceph-mon[75798]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Dec  6 04:43:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:16.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:17 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Dec  6 04:43:17 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Dec  6 04:43:17 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'crash'
Dec  6 04:43:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:17.298+0000 7f75dcc0e140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:43:17 np0005548918 ceph-mgr[76108]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  6 04:43:17 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'dashboard'
Dec  6 04:43:17 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Dec  6 04:43:17 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 88 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=87/88 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[65,87)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:17 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 88 pg[10.9( v 51'1027 (0'0,51'1027] local-lis/les=87/88 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[65,87)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:17 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003db0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:17 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'devicehealth'
Dec  6 04:43:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:17 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:17.967+0000 7f75dcc0e140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:43:17 np0005548918 ceph-mgr[76108]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  6 04:43:17 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'diskprediction_local'
Dec  6 04:43:18 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Dec  6 04:43:18 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Dec  6 04:43:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  6 04:43:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  6 04:43:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]:  from numpy import show_config as show_numpy_config
Dec  6 04:43:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:18.133+0000 7f75dcc0e140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548918 ceph-mgr[76108]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'influx'
Dec  6 04:43:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:18.210+0000 7f75dcc0e140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548918 ceph-mgr[76108]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'insights'
Dec  6 04:43:18 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Dec  6 04:43:18 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 89 pg[10.9( v 51'1027 (0'0,51'1027] local-lis/les=87/88 n=6 ec=58/45 lis/c=87/65 les/c/f=88/66/0 sis=89 pruub=15.007217407s) [1] async=[1] r=-1 lpr=89 pi=[65,89)/1 crt=51'1027 mlcod 51'1027 active pruub 150.589691162s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:18 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 89 pg[10.9( v 51'1027 (0'0,51'1027] local-lis/les=87/88 n=6 ec=58/45 lis/c=87/65 les/c/f=88/66/0 sis=89 pruub=15.007141113s) [1] r=-1 lpr=89 pi=[65,89)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 150.589691162s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:18 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 89 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=87/88 n=5 ec=58/45 lis/c=87/65 les/c/f=88/66/0 sis=89 pruub=15.001312256s) [1] async=[1] r=-1 lpr=89 pi=[65,89)/1 crt=51'1027 mlcod 51'1027 active pruub 150.584732056s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:18 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 89 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=87/88 n=5 ec=58/45 lis/c=87/65 les/c/f=88/66/0 sis=89 pruub=15.001262665s) [1] r=-1 lpr=89 pi=[65,89)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 150.584732056s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'iostat'
Dec  6 04:43:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:18.402+0000 7f75dcc0e140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548918 ceph-mgr[76108]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  6 04:43:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'k8sevents'
Dec  6 04:43:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:18.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:18 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:18.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'localpool'
Dec  6 04:43:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:18 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'mds_autoscaler'
Dec  6 04:43:18 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Dec  6 04:43:19 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'mirroring'
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'nfs'
Dec  6 04:43:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:19.391+0000 7f75dcc0e140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'orchestrator'
Dec  6 04:43:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:19 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ccc0021b0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:19.624+0000 7f75dcc0e140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'osd_perf_query'
Dec  6 04:43:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:19.705+0000 7f75dcc0e140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'osd_support'
Dec  6 04:43:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:19.771+0000 7f75dcc0e140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'pg_autoscaler'
Dec  6 04:43:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:19.847+0000 7f75dcc0e140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'progress'
Dec  6 04:43:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:19.916+0000 7f75dcc0e140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  6 04:43:19 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'prometheus'
Dec  6 04:43:19 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.a scrub starts
Dec  6 04:43:19 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.a scrub ok
Dec  6 04:43:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:19 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003db0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:20.280+0000 7f75dcc0e140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548918 ceph-mgr[76108]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rbd_support'
Dec  6 04:43:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Dec  6 04:43:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:20.388+0000 7f75dcc0e140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548918 ceph-mgr[76108]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'restful'
Dec  6 04:43:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:20.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:20 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:20 np0005548918 systemd[1]: session-36.scope: Deactivated successfully.
Dec  6 04:43:20 np0005548918 systemd[1]: session-36.scope: Consumed 8.062s CPU time.
Dec  6 04:43:20 np0005548918 systemd-logind[800]: Session 36 logged out. Waiting for processes to exit.
Dec  6 04:43:20 np0005548918 systemd-logind[800]: Removed session 36.
Dec  6 04:43:20 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rgw'
Dec  6 04:43:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:20.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:20.824+0000 7f75dcc0e140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548918 ceph-mgr[76108]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  6 04:43:20 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'rook'
Dec  6 04:43:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:20 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.16 scrub starts
Dec  6 04:43:20 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.16 scrub ok
Dec  6 04:43:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:21 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8001ff0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:21.409+0000 7f75dcc0e140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548918 ceph-mgr[76108]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'selftest'
Dec  6 04:43:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:21.486+0000 7f75dcc0e140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548918 ceph-mgr[76108]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'snap_schedule'
Dec  6 04:43:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:21.580+0000 7f75dcc0e140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548918 ceph-mgr[76108]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'stats'
Dec  6 04:43:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'status'
Dec  6 04:43:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:21.743+0000 7f75dcc0e140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548918 ceph-mgr[76108]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'telegraf'
Dec  6 04:43:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:21.815+0000 7f75dcc0e140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548918 ceph-mgr[76108]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'telemetry'
Dec  6 04:43:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:21 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.17 scrub starts
Dec  6 04:43:21 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.17 scrub ok
Dec  6 04:43:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:21 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cac000b60 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:21.973+0000 7f75dcc0e140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548918 ceph-mgr[76108]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  6 04:43:21 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'test_orchestrator'
Dec  6 04:43:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:22.185+0000 7f75dcc0e140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'volumes'
Dec  6 04:43:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:43:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:22.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:43:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:22.442+0000 7f75dcc0e140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: mgr[py] Loading python module 'zabbix'
Dec  6 04:43:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 2025-12-06T09:43:22.510+0000 7f75dcc0e140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: mgr load Constructed class from module: dashboard
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: mgr load Constructed class from module: prometheus
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: [dashboard INFO root] Starting engine...
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: [prometheus INFO root] server_addr: :: server_port: 9283
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: [prometheus INFO root] Starting engine...
Dec  6 04:43:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: [06/Dec/2025:09:43:22] ENGINE Bus STARTING
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: [prometheus INFO cherrypy.error] [06/Dec/2025:09:43:22] ENGINE Bus STARTING
Dec  6 04:43:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: CherryPy Checker:
Dec  6 04:43:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: The Application mounted at '' has an empty config.
Dec  6 04:43:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: 
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: ms_deliver_dispatch: unhandled message 0x56115089f860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Dec  6 04:43:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:22 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003d10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: [dashboard INFO root] Engine started...
Dec  6 04:43:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:22.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: [06/Dec/2025:09:43:22] ENGINE Serving on http://:::9283
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: [prometheus INFO cherrypy.error] [06/Dec/2025:09:43:22] ENGINE Serving on http://:::9283
Dec  6 04:43:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-2-oazbvn[76104]: [06/Dec/2025:09:43:22] ENGINE Bus STARTED
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: [prometheus INFO cherrypy.error] [06/Dec/2025:09:43:22] ENGINE Bus STARTED
Dec  6 04:43:22 np0005548918 ceph-mgr[76108]: [prometheus INFO root] Engine started.
Dec  6 04:43:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:22 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.9 scrub starts
Dec  6 04:43:22 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.9 scrub ok
Dec  6 04:43:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:23 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:23 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Dec  6 04:43:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:23 np0005548918 ceph-mon[75798]: Active manager daemon compute-0.qhdjwa restarted
Dec  6 04:43:23 np0005548918 ceph-mon[75798]: Activating manager daemon compute-0.qhdjwa
Dec  6 04:43:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:23 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8001ff0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:23 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.7 deep-scrub starts
Dec  6 04:43:24 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.7 deep-scrub ok
Dec  6 04:43:24 np0005548918 systemd-logind[800]: New session 37 of user ceph-admin.
Dec  6 04:43:24 np0005548918 systemd[1]: Started Session 37 of User ceph-admin.
Dec  6 04:43:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:24.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:24 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cac0016a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:24.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:24 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.11 scrub starts
Dec  6 04:43:24 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.11 scrub ok
Dec  6 04:43:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:25 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003d30 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:25 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.1a deep-scrub starts
Dec  6 04:43:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:25 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:26.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:26 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8001ff0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:26.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:26 np0005548918 podman[86653]: 2025-12-06 09:43:26.694522581 +0000 UTC m=+1.771602091 container exec 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325)
Dec  6 04:43:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:27 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.1a deep-scrub ok
Dec  6 04:43:27 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.2 scrub starts
Dec  6 04:43:27 np0005548918 podman[86653]: 2025-12-06 09:43:27.283923909 +0000 UTC m=+2.361003439 container exec_died 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:43:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:27 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cac0016a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:27 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.2 scrub ok
Dec  6 04:43:27 np0005548918 ceph-mon[75798]: Manager daemon compute-0.qhdjwa is now available
Dec  6 04:43:27 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:27 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/mirror_snapshot_schedule"}]: dispatch
Dec  6 04:43:27 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/trash_purge_schedule"}]: dispatch
Dec  6 04:43:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:27 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cac0016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:28 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Dec  6 04:43:28 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Dec  6 04:43:28 np0005548918 podman[86775]: 2025-12-06 09:43:28.080469707 +0000 UTC m=+0.048021775 container exec 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:43:28 np0005548918 podman[86800]: 2025-12-06 09:43:28.141312301 +0000 UTC m=+0.048036704 container exec_died 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:43:28 np0005548918 podman[86775]: 2025-12-06 09:43:28.14644066 +0000 UTC m=+0.113992718 container exec_died 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:43:28 np0005548918 podman[86865]: 2025-12-06 09:43:28.414966778 +0000 UTC m=+0.053596854 container exec a3634fe4060dc94c2c20aff61ae4ab07f3ae7c7af9e41801a8e759fad2a4f938 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Dec  6 04:43:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:28.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:28 np0005548918 podman[86886]: 2025-12-06 09:43:28.48126674 +0000 UTC m=+0.049118803 container exec_died a3634fe4060dc94c2c20aff61ae4ab07f3ae7c7af9e41801a8e759fad2a4f938 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True)
Dec  6 04:43:28 np0005548918 podman[86865]: 2025-12-06 09:43:28.523032335 +0000 UTC m=+0.161662411 container exec_died a3634fe4060dc94c2c20aff61ae4ab07f3ae7c7af9e41801a8e759fad2a4f938 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:43:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:28 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:28.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:28 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:43:25] ENGINE Bus STARTING
Dec  6 04:43:28 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:43:25] ENGINE Serving on https://192.168.122.100:7150
Dec  6 04:43:28 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:43:25] ENGINE Client ('192.168.122.100', 44988) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  6 04:43:28 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:43:25] ENGINE Serving on http://192.168.122.100:8765
Dec  6 04:43:28 np0005548918 ceph-mon[75798]: [06/Dec/2025:09:43:25] ENGINE Bus STARTED
Dec  6 04:43:28 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:28 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:28 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]: dispatch
Dec  6 04:43:28 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Dec  6 04:43:28 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Dec  6 04:43:28 np0005548918 podman[86931]: 2025-12-06 09:43:28.752836904 +0000 UTC m=+0.057654896 container exec 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna)
Dec  6 04:43:28 np0005548918 podman[86931]: 2025-12-06 09:43:28.757990693 +0000 UTC m=+0.062808685 container exec_died 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna)
Dec  6 04:43:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:28 np0005548918 podman[86994]: 2025-12-06 09:43:28.931256914 +0000 UTC m=+0.039102631 container exec cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, version=2.2.4, release=1793, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, io.buildah.version=1.28.2, vcs-type=git, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  6 04:43:28 np0005548918 podman[86994]: 2025-12-06 09:43:28.947522512 +0000 UTC m=+0.055368119 container exec_died cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, vcs-type=git, version=2.2.4, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, name=keepalived)
Dec  6 04:43:28 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Dec  6 04:43:28 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Dec  6 04:43:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:29 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8001ff0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:29 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]': finished
Dec  6 04:43:29 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Dec  6 04:43:29 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:29 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:29 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  6 04:43:29 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Dec  6 04:43:29 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:29 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Dec  6 04:43:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:29 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8001ff0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:30 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.3 scrub starts
Dec  6 04:43:30 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.3 scrub ok
Dec  6 04:43:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:30.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:30 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8001ff0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:30.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Dec  6 04:43:30 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]: dispatch
Dec  6 04:43:30 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Dec  6 04:43:30 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:30 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:30 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  6 04:43:30 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:30 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:30 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 94 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=94 pruub=10.451518059s) [1] r=-1 lpr=94 pi=[65,94)/1 crt=51'1027 mlcod 0'0 active pruub 158.491195679s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:30 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 94 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=94 pruub=10.451478958s) [1] r=-1 lpr=94 pi=[65,94)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 158.491195679s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:30 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 94 pg[10.b( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=94 pruub=10.451292992s) [1] r=-1 lpr=94 pi=[65,94)/1 crt=51'1027 mlcod 0'0 active pruub 158.491210938s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:30 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 94 pg[10.b( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=94 pruub=10.451199532s) [1] r=-1 lpr=94 pi=[65,94)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 158.491210938s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:31 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.1d scrub starts
Dec  6 04:43:31 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.1d scrub ok
Dec  6 04:43:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:31 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]': finished
Dec  6 04:43:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Dec  6 04:43:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]: dispatch
Dec  6 04:43:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Dec  6 04:43:31 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Dec  6 04:43:31 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 95 pg[10.c( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=6 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=95 pruub=14.468092918s) [1] r=-1 lpr=95 pi=[74,95)/1 crt=51'1027 mlcod 0'0 active pruub 163.532089233s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:31 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 95 pg[10.c( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=6 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=95 pruub=14.468053818s) [1] r=-1 lpr=95 pi=[74,95)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 163.532089233s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:31 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 95 pg[10.1c( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=5 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=95 pruub=14.470390320s) [1] r=-1 lpr=95 pi=[74,95)/1 crt=51'1027 mlcod 0'0 active pruub 163.534591675s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:31 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 95 pg[10.1c( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=5 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=95 pruub=14.470355034s) [1] r=-1 lpr=95 pi=[74,95)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 163.534591675s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:31 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 95 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] r=0 lpr=95 pi=[65,95)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:31 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 95 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] r=0 lpr=95 pi=[65,95)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:31 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 95 pg[10.b( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] r=0 lpr=95 pi=[65,95)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:31 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 95 pg[10.b( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] r=0 lpr=95 pi=[65,95)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:31 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.4 deep-scrub starts
Dec  6 04:43:31 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.4 deep-scrub ok
Dec  6 04:43:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:31 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003f10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:32.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:32 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:32.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:32 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.1e scrub starts
Dec  6 04:43:32 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.1e scrub ok
Dec  6 04:43:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:33 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:33 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.13 scrub starts
Dec  6 04:43:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:33 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:34 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.13 scrub ok
Dec  6 04:43:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:34.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:34 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003f30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:34.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:34 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.18 scrub starts
Dec  6 04:43:35 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 12.18 scrub ok
Dec  6 04:43:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Dec  6 04:43:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 96 pg[10.c( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=6 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=96) [1]/[2] r=0 lpr=96 pi=[74,96)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 96 pg[10.c( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=6 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=96) [1]/[2] r=0 lpr=96 pi=[74,96)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 96 pg[10.1c( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=5 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=96) [1]/[2] r=0 lpr=96 pi=[74,96)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 96 pg[10.1c( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=5 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=96) [1]/[2] r=0 lpr=96 pi=[74,96)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]': finished
Dec  6 04:43:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Dec  6 04:43:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  6 04:43:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:43:35 np0005548918 ceph-mon[75798]: Updating compute-0:/etc/ceph/ceph.conf
Dec  6 04:43:35 np0005548918 ceph-mon[75798]: Updating compute-1:/etc/ceph/ceph.conf
Dec  6 04:43:35 np0005548918 ceph-mon[75798]: Updating compute-2:/etc/ceph/ceph.conf
Dec  6 04:43:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 96 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=95/96 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] async=[1] r=0 lpr=95 pi=[65,95)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:35 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 96 pg[10.b( v 51'1027 (0'0,51'1027] local-lis/les=95/96 n=6 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] async=[1] r=0 lpr=95 pi=[65,95)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:35 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8001ff0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:35 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:36 np0005548918 systemd-logind[800]: New session 38 of user zuul.
Dec  6 04:43:36 np0005548918 systemd[1]: Started Session 38 of User zuul.
Dec  6 04:43:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:36.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:36 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4001fe0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:43:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:36.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:43:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:36 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:36 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 97 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=95/96 n=5 ec=58/45 lis/c=95/65 les/c/f=96/66/0 sis=97 pruub=14.460686684s) [1] async=[1] r=-1 lpr=97 pi=[65,97)/1 crt=51'1027 mlcod 51'1027 active pruub 168.610885620s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:36 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 97 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=95/96 n=5 ec=58/45 lis/c=95/65 les/c/f=96/66/0 sis=97 pruub=14.460103035s) [1] r=-1 lpr=97 pi=[65,97)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 168.610885620s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:36 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 97 pg[10.c( v 51'1027 (0'0,51'1027] local-lis/les=96/97 n=6 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=96) [1]/[2] async=[1] r=0 lpr=96 pi=[74,96)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:36 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 97 pg[10.1c( v 51'1027 (0'0,51'1027] local-lis/les=96/97 n=5 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=96) [1]/[2] async=[1] r=0 lpr=96 pi=[74,96)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:43:36 np0005548918 python3.9[88269]: ansible-ansible.legacy.ping Invoked with data=pong
Dec  6 04:43:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:37 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:37 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Dec  6 04:43:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 98 pg[10.c( v 51'1027 (0'0,51'1027] local-lis/les=96/97 n=6 ec=58/45 lis/c=96/74 les/c/f=97/75/0 sis=98 pruub=15.020521164s) [1] async=[1] r=-1 lpr=98 pi=[74,98)/1 crt=51'1027 mlcod 51'1027 active pruub 170.157180786s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 98 pg[10.c( v 51'1027 (0'0,51'1027] local-lis/les=96/97 n=6 ec=58/45 lis/c=96/74 les/c/f=97/75/0 sis=98 pruub=15.020419121s) [1] r=-1 lpr=98 pi=[74,98)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 170.157180786s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 98 pg[10.1c( v 51'1027 (0'0,51'1027] local-lis/les=96/97 n=5 ec=58/45 lis/c=96/74 les/c/f=97/75/0 sis=98 pruub=15.020762444s) [1] async=[1] r=-1 lpr=98 pi=[74,98)/1 crt=51'1027 mlcod 51'1027 active pruub 170.157806396s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 98 pg[10.1c( v 51'1027 (0'0,51'1027] local-lis/les=96/97 n=5 ec=58/45 lis/c=96/74 les/c/f=97/75/0 sis=98 pruub=15.020657539s) [1] r=-1 lpr=98 pi=[74,98)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 170.157806396s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 98 pg[10.b( v 51'1027 (0'0,51'1027] local-lis/les=95/96 n=6 ec=58/45 lis/c=95/65 les/c/f=96/66/0 sis=98 pruub=13.473182678s) [1] async=[1] r=-1 lpr=98 pi=[65,98)/1 crt=51'1027 mlcod 51'1027 active pruub 168.610931396s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:37 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 98 pg[10.b( v 51'1027 (0'0,51'1027] local-lis/les=95/96 n=6 ec=58/45 lis/c=95/65 les/c/f=96/66/0 sis=98 pruub=13.473112106s) [1] r=-1 lpr=98 pi=[65,98)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 168.610931396s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:37 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:37 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:37 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:37 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:37 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:43:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:37 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:38 np0005548918 python3.9[88445]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:43:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:38.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:38 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:38.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Dec  6 04:43:39 np0005548918 python3.9[88602]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:43:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:39 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4001fe0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:39 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003f70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:40.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:40 np0005548918 python3.9[88756]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:43:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:40 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003f70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:40.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:41 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.e scrub starts
Dec  6 04:43:41 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.e scrub ok
Dec  6 04:43:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:41 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:41 np0005548918 python3.9[88911]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:43:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:42 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4001fe0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:42 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Dec  6 04:43:42 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Dec  6 04:43:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:42.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:42 np0005548918 python3.9[89064]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:43:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:42 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:42.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:43 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Dec  6 04:43:43 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Dec  6 04:43:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:43 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003f90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:44 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:44 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Dec  6 04:43:44 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Dec  6 04:43:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:43:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:44.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:43:44 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:44 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca40032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:44.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:44 np0005548918 python3.9[89241]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:43:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Dec  6 04:43:44 np0005548918 network[89283]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:43:44 np0005548918 network[89284]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:43:44 np0005548918 network[89285]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:43:45 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.4 scrub starts
Dec  6 04:43:45 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.4 scrub ok
Dec  6 04:43:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:45 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094345 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:43:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:46 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003fb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:46 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:46 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]: dispatch
Dec  6 04:43:46 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Dec  6 04:43:46 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:46 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]': finished
Dec  6 04:43:46 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Dec  6 04:43:46 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  6 04:43:46 np0005548918 systemd[81631]: Starting Mark boot as successful...
Dec  6 04:43:46 np0005548918 systemd[81631]: Finished Mark boot as successful.
Dec  6 04:43:46 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Dec  6 04:43:46 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Dec  6 04:43:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:46.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:46 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:46.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:47 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Dec  6 04:43:47 np0005548918 ceph-mon[75798]: Reconfiguring mon.compute-0 (monmap changed)...
Dec  6 04:43:47 np0005548918 ceph-mon[75798]: Reconfiguring daemon mon.compute-0 on compute-0
Dec  6 04:43:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Dec  6 04:43:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Dec  6 04:43:47 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.14 scrub starts
Dec  6 04:43:47 np0005548918 ceph-osd[78376]: log_channel(cluster) log [DBG] : 10.14 scrub ok
Dec  6 04:43:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:47 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca40032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:48 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:48.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:48 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:48.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:49 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:50 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:50.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:50 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8003ff0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Dec  6 04:43:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:50.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:51 np0005548918 python3.9[89553]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:43:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:51 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:51 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:52 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:52 np0005548918 python3.9[89704]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:43:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:52.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:52 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:52.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Dec  6 04:43:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Dec  6 04:43:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Dec  6 04:43:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.qhdjwa", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  6 04:43:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:53 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004010 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:53 np0005548918 python3.9[89859]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:43:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Dec  6 04:43:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:54 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:54 np0005548918 ceph-mon[75798]: Reconfiguring mgr.compute-0.qhdjwa (monmap changed)...
Dec  6 04:43:54 np0005548918 ceph-mon[75798]: Reconfiguring daemon mgr.compute-0.qhdjwa on compute-0
Dec  6 04:43:54 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:54 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:54 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  6 04:43:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:54.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:54 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:43:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:54.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:43:54 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Dec  6 04:43:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:55 np0005548918 ceph-mon[75798]: Reconfiguring crash.compute-0 (monmap changed)...
Dec  6 04:43:55 np0005548918 ceph-mon[75798]: Reconfiguring daemon crash.compute-0 on compute-0
Dec  6 04:43:55 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:55 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:55 np0005548918 ceph-mon[75798]: Reconfiguring osd.1 (monmap changed)...
Dec  6 04:43:55 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Dec  6 04:43:55 np0005548918 ceph-mon[75798]: Reconfiguring daemon osd.1 on compute-0
Dec  6 04:43:55 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:55 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:55 np0005548918 python3.9[90018]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:43:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:55 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:55 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:43:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:56 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004030 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:56 np0005548918 ceph-mon[75798]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Dec  6 04:43:56 np0005548918 ceph-mon[75798]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Dec  6 04:43:56 np0005548918 python3.9[90104]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:43:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:43:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:56.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:43:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:56 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:56 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:43:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:56.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:57 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:57 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:57 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:58 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Dec  6 04:43:58 np0005548918 ceph-mon[75798]: Reconfiguring grafana.compute-0 (dependencies changed)...
Dec  6 04:43:58 np0005548918 ceph-mon[75798]: Reconfiguring daemon grafana.compute-0 on compute-0
Dec  6 04:43:58 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Dec  6 04:43:58 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 106 pg[10.1f( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=85/85 les/c/f=86/86/0 sis=106) [2] r=0 lpr=106 pi=[85,106)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:58 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 106 pg[10.f( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=86/86 les/c/f=87/87/0 sis=106) [2] r=0 lpr=106 pi=[86,106)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:43:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:58.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:43:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:58 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:58 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:43:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:58 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:43:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:43:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:43:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:58.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:43:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Dec  6 04:43:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:58 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 107 pg[10.1f( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=85/85 les/c/f=86/86/0 sis=107) [2]/[1] r=-1 lpr=107 pi=[85,107)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:58 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 107 pg[10.1f( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=85/85 les/c/f=86/86/0 sis=107) [2]/[1] r=-1 lpr=107 pi=[85,107)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:58 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 107 pg[10.f( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=86/86 les/c/f=87/87/0 sis=107) [2]/[1] r=-1 lpr=107 pi=[86,107)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:43:58 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 107 pg[10.f( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=86/86 les/c/f=87/87/0 sis=107) [2]/[1] r=-1 lpr=107 pi=[86,107)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:43:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:59 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Dec  6 04:43:59 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:59 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:43:59 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  6 04:43:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:43:59 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:43:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:43:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:43:59 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Dec  6 04:43:59 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 108 pg[10.10( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=108) [2] r=0 lpr=108 pi=[58,108)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:43:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:43:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:00 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:00 np0005548918 ceph-mon[75798]: Reconfiguring crash.compute-1 (monmap changed)...
Dec  6 04:44:00 np0005548918 ceph-mon[75798]: Reconfiguring daemon crash.compute-1 on compute-1
Dec  6 04:44:00 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Dec  6 04:44:00 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Dec  6 04:44:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:44:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:00.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:44:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:00 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:44:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:00.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:44:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Dec  6 04:44:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:00 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 109 pg[10.f( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=7 ec=58/45 lis/c=107/86 les/c/f=108/87/0 sis=109) [2] r=0 lpr=109 pi=[86,109)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:00 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 109 pg[10.f( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=7 ec=58/45 lis/c=107/86 les/c/f=108/87/0 sis=109) [2] r=0 lpr=109 pi=[86,109)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:00 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 109 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=107/85 les/c/f=108/86/0 sis=109) [2] r=0 lpr=109 pi=[85,109)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:00 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 109 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=107/85 les/c/f=108/86/0 sis=109) [2] r=0 lpr=109 pi=[85,109)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:00 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 109 pg[10.10( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[58,109)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:00 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 109 pg[10.10( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=109) [2]/[1] r=-1 lpr=109 pi=[58,109)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:01 np0005548918 ceph-mon[75798]: Reconfiguring osd.0 (monmap changed)...
Dec  6 04:44:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Dec  6 04:44:01 np0005548918 ceph-mon[75798]: Reconfiguring daemon osd.0 on compute-1
Dec  6 04:44:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:01 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:01 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:44:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:02 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:02 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:44:02 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Dec  6 04:44:02 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 110 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=109/110 n=5 ec=58/45 lis/c=107/85 les/c/f=108/86/0 sis=109) [2] r=0 lpr=109 pi=[85,109)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:44:02 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 110 pg[10.f( v 51'1027 (0'0,51'1027] local-lis/les=109/110 n=7 ec=58/45 lis/c=107/86 les/c/f=108/87/0 sis=109) [2] r=0 lpr=109 pi=[86,109)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:44:02 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:02 np0005548918 ceph-mon[75798]: Reconfiguring mon.compute-1 (monmap changed)...
Dec  6 04:44:02 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  6 04:44:02 np0005548918 ceph-mon[75798]: Reconfiguring daemon mon.compute-1 on compute-1
Dec  6 04:44:02 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Dec  6 04:44:02 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Dec  6 04:44:02 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:02 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:02 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  6 04:44:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:02.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:02 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:02.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:02 np0005548918 podman[90247]: 2025-12-06 09:44:02.763749841 +0000 UTC m=+0.068165429 container create 7cc48472a32d47f0f2a3ab69518c941d017239a7591f2b79b4f4b6312393e10f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jovial_ramanujan, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Dec  6 04:44:02 np0005548918 systemd[1]: Started libpod-conmon-7cc48472a32d47f0f2a3ab69518c941d017239a7591f2b79b4f4b6312393e10f.scope.
Dec  6 04:44:02 np0005548918 podman[90247]: 2025-12-06 09:44:02.723038171 +0000 UTC m=+0.027453789 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:44:02 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:44:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:02 np0005548918 podman[90247]: 2025-12-06 09:44:02.851228163 +0000 UTC m=+0.155643751 container init 7cc48472a32d47f0f2a3ab69518c941d017239a7591f2b79b4f4b6312393e10f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jovial_ramanujan, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:44:02 np0005548918 podman[90247]: 2025-12-06 09:44:02.856694629 +0000 UTC m=+0.161110187 container start 7cc48472a32d47f0f2a3ab69518c941d017239a7591f2b79b4f4b6312393e10f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jovial_ramanujan, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:44:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:02 np0005548918 podman[90247]: 2025-12-06 09:44:02.859758917 +0000 UTC m=+0.164174485 container attach 7cc48472a32d47f0f2a3ab69518c941d017239a7591f2b79b4f4b6312393e10f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jovial_ramanujan, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 04:44:02 np0005548918 jovial_ramanujan[90263]: 167 167
Dec  6 04:44:02 np0005548918 systemd[1]: libpod-7cc48472a32d47f0f2a3ab69518c941d017239a7591f2b79b4f4b6312393e10f.scope: Deactivated successfully.
Dec  6 04:44:02 np0005548918 podman[90247]: 2025-12-06 09:44:02.863657704 +0000 UTC m=+0.168073282 container died 7cc48472a32d47f0f2a3ab69518c941d017239a7591f2b79b4f4b6312393e10f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jovial_ramanujan, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec  6 04:44:02 np0005548918 systemd[1]: var-lib-containers-storage-overlay-31b3c2be719157600c575250501645dda3673309fdb5231865dab1967db22765-merged.mount: Deactivated successfully.
Dec  6 04:44:03 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Dec  6 04:44:03 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 111 pg[10.10( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=2 ec=58/45 lis/c=109/58 les/c/f=110/59/0 sis=111) [2] r=0 lpr=111 pi=[58,111)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:03 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 111 pg[10.10( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=2 ec=58/45 lis/c=109/58 les/c/f=110/59/0 sis=111) [2] r=0 lpr=111 pi=[58,111)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:03 np0005548918 podman[90247]: 2025-12-06 09:44:03.42943665 +0000 UTC m=+0.733852258 container remove 7cc48472a32d47f0f2a3ab69518c941d017239a7591f2b79b4f4b6312393e10f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jovial_ramanujan, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:44:03 np0005548918 ceph-mon[75798]: Reconfiguring mon.compute-2 (monmap changed)...
Dec  6 04:44:03 np0005548918 ceph-mon[75798]: Reconfiguring daemon mon.compute-2 on compute-2
Dec  6 04:44:03 np0005548918 systemd[1]: libpod-conmon-7cc48472a32d47f0f2a3ab69518c941d017239a7591f2b79b4f4b6312393e10f.scope: Deactivated successfully.
Dec  6 04:44:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:03 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:03 np0005548918 podman[90351]: 2025-12-06 09:44:03.952007483 +0000 UTC m=+0.044788333 container create dda25c9303a760066845b18fbe73b60f44ea620eb9b8ec7d4dd3e26366ea8508 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_mendel, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Dec  6 04:44:03 np0005548918 systemd[1]: Started libpod-conmon-dda25c9303a760066845b18fbe73b60f44ea620eb9b8ec7d4dd3e26366ea8508.scope.
Dec  6 04:44:04 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:44:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:04 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:04 np0005548918 podman[90351]: 2025-12-06 09:44:03.931284974 +0000 UTC m=+0.024065744 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:44:04 np0005548918 podman[90351]: 2025-12-06 09:44:04.032118481 +0000 UTC m=+0.124899291 container init dda25c9303a760066845b18fbe73b60f44ea620eb9b8ec7d4dd3e26366ea8508 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_mendel, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Dec  6 04:44:04 np0005548918 podman[90351]: 2025-12-06 09:44:04.039452145 +0000 UTC m=+0.132232865 container start dda25c9303a760066845b18fbe73b60f44ea620eb9b8ec7d4dd3e26366ea8508 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_mendel, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:44:04 np0005548918 podman[90351]: 2025-12-06 09:44:04.04205899 +0000 UTC m=+0.134839800 container attach dda25c9303a760066845b18fbe73b60f44ea620eb9b8ec7d4dd3e26366ea8508 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_mendel, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:44:04 np0005548918 musing_mendel[90368]: 167 167
Dec  6 04:44:04 np0005548918 systemd[1]: libpod-dda25c9303a760066845b18fbe73b60f44ea620eb9b8ec7d4dd3e26366ea8508.scope: Deactivated successfully.
Dec  6 04:44:04 np0005548918 podman[90351]: 2025-12-06 09:44:04.046244815 +0000 UTC m=+0.139025545 container died dda25c9303a760066845b18fbe73b60f44ea620eb9b8ec7d4dd3e26366ea8508 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_mendel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:44:04 np0005548918 systemd[1]: var-lib-containers-storage-overlay-c85761a8d3fca3ed7bc67cca4dce6a008112d84e4738a0feb10765380da8f824-merged.mount: Deactivated successfully.
Dec  6 04:44:04 np0005548918 podman[90351]: 2025-12-06 09:44:04.081083058 +0000 UTC m=+0.173863778 container remove dda25c9303a760066845b18fbe73b60f44ea620eb9b8ec7d4dd3e26366ea8508 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_mendel, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  6 04:44:04 np0005548918 systemd[1]: libpod-conmon-dda25c9303a760066845b18fbe73b60f44ea620eb9b8ec7d4dd3e26366ea8508.scope: Deactivated successfully.
Dec  6 04:44:04 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Dec  6 04:44:04 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 112 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=112) [2] r=0 lpr=112 pi=[68,112)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:04 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 112 pg[10.10( v 51'1027 (0'0,51'1027] local-lis/les=111/112 n=2 ec=58/45 lis/c=109/58 les/c/f=110/59/0 sis=111) [2] r=0 lpr=111 pi=[58,111)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:44:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:04.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:04 np0005548918 ceph-mon[75798]: Reconfiguring mgr.compute-2.oazbvn (monmap changed)...
Dec  6 04:44:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.oazbvn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  6 04:44:04 np0005548918 ceph-mon[75798]: Reconfiguring daemon mgr.compute-2.oazbvn on compute-2
Dec  6 04:44:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Dec  6 04:44:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Dec  6 04:44:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Dec  6 04:44:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:04 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:44:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:04.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:44:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Dec  6 04:44:05 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 113 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=113) [2]/[0] r=-1 lpr=113 pi=[68,113)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:05 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 113 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=113) [2]/[0] r=-1 lpr=113 pi=[68,113)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:05 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:06 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Dec  6 04:44:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:06.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:06 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Dec  6 04:44:06 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 114 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=114 pruub=14.767874718s) [0] r=-1 lpr=114 pi=[65,114)/1 crt=51'1027 mlcod 0'0 active pruub 198.492263794s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:06 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 114 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=114 pruub=14.767754555s) [0] r=-1 lpr=114 pi=[65,114)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 198.492263794s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:06 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:44:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:06 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:44:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:06.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:44:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:07 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:07 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Dec  6 04:44:07 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Dec  6 04:44:07 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 115 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=4 ec=58/45 lis/c=113/68 les/c/f=114/69/0 sis=115) [2] r=0 lpr=115 pi=[68,115)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:07 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 115 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=4 ec=58/45 lis/c=113/68 les/c/f=114/69/0 sis=115) [2] r=0 lpr=115 pi=[68,115)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:07 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 115 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=115) [0]/[2] r=0 lpr=115 pi=[65,115)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:07 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 115 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=65/66 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=115) [0]/[2] r=0 lpr=115 pi=[65,115)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094407 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:44:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:08.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:08 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Dec  6 04:44:08 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 116 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=115/116 n=4 ec=58/45 lis/c=113/68 les/c/f=114/69/0 sis=115) [2] r=0 lpr=115 pi=[68,115)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:44:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  6 04:44:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:08.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  6 04:44:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:09 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 116 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=115/116 n=5 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=115) [0]/[2] async=[0] r=0 lpr=115 pi=[65,115)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:44:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:09 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:09 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:09 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:09 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:44:09 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:09 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:09 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:44:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:10 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Dec  6 04:44:10 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 117 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=115/116 n=5 ec=58/45 lis/c=115/65 les/c/f=116/66/0 sis=117 pruub=15.202275276s) [0] async=[0] r=-1 lpr=117 pi=[65,117)/1 crt=51'1027 mlcod 51'1027 active pruub 202.537994385s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:10 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 117 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=115/116 n=5 ec=58/45 lis/c=115/65 les/c/f=116/66/0 sis=117 pruub=15.202201843s) [0] r=-1 lpr=117 pi=[65,117)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 202.537994385s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:10.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:10 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:10.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:11 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Dec  6 04:44:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:11 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:11 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  6 04:44:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:12 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:12.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:12 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8004780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:12.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:13 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:14 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:14 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Dec  6 04:44:14 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 119 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=5 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=119 pruub=11.936896324s) [0] r=-1 lpr=119 pi=[74,119)/1 crt=51'1027 mlcod 0'0 active pruub 203.535644531s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:14 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 119 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=5 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=119 pruub=11.936861992s) [0] r=-1 lpr=119 pi=[74,119)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 203.535644531s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:14 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Dec  6 04:44:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:14.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:14 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:44:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:14.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:44:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Dec  6 04:44:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:15 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:15 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 120 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=5 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=120) [0]/[2] r=0 lpr=120 pi=[74,120)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:15 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 120 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=74/75 n=5 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=120) [0]/[2] r=0 lpr=120 pi=[74,120)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:15 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Dec  6 04:44:15 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:15 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:44:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:16 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:16.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:16 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:16 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:16 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Dec  6 04:44:16 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Dec  6 04:44:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:16.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:16 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 121 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=120/121 n=5 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[74,120)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:44:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:17 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:17 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Dec  6 04:44:17 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Dec  6 04:44:17 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 122 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=120/121 n=5 ec=58/45 lis/c=120/74 les/c/f=121/75/0 sis=122 pruub=14.982101440s) [0] async=[0] r=-1 lpr=122 pi=[74,122)/1 crt=51'1027 mlcod 51'1027 active pruub 209.963729858s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:17 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 122 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=120/121 n=5 ec=58/45 lis/c=120/74 les/c/f=121/75/0 sis=122 pruub=14.982007980s) [0] r=-1 lpr=122 pi=[74,122)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 209.963729858s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:18 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:18.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:18 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:18.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:18 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Dec  6 04:44:18 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Dec  6 04:44:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:19 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:20 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:20.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:20 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:20.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:21 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:21 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:21 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Dec  6 04:44:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:21 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Dec  6 04:44:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:22 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:22.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:22 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:22.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:22 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Dec  6 04:44:22 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Dec  6 04:44:22 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Dec  6 04:44:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:22 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Dec  6 04:44:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:23 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:23 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Dec  6 04:44:23 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Dec  6 04:44:23 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Dec  6 04:44:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:24 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4003340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:24.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:24 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:24.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:24 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Dec  6 04:44:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:25 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:25 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Dec  6 04:44:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Dec  6 04:44:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:26 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:26.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:26 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:26.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:26 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:26 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Dec  6 04:44:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Dec  6 04:44:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:27 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:28 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Dec  6 04:44:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:28 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:44:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:28.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:44:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:28 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:28.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:28 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Dec  6 04:44:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:29 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Dec  6 04:44:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:30 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:30.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:30 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:30.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:31 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:31 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:32 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:32.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:32 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:44:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:32.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:44:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:33 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:34 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc001320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Dec  6 04:44:34 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Dec  6 04:44:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:34.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:34 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:34.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Dec  6 04:44:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:35 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094435 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:44:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:36 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:36 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Dec  6 04:44:36 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Dec  6 04:44:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:36.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:36 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc001320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:36.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:36 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:37 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Dec  6 04:44:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:37 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:37 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Dec  6 04:44:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:38 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:38.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:38 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Dec  6 04:44:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:38.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:39 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc001320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:39 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Dec  6 04:44:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:40 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:40.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:40 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:40.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Dec  6 04:44:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:41 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:41 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:42 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc0091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:42.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:42 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:44:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:42.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:44:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:43 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:43 np0005548918 python3.9[90728]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:44:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:44 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:44 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:44:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:44 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Dec  6 04:44:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Dec  6 04:44:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:44:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:44.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:44:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:44 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:44:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:44.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:44:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:45 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Dec  6 04:44:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:45 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:46 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:46 np0005548918 python3.9[91043]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec  6 04:44:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:46 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Dec  6 04:44:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e139 e139: 3 total, 3 up, 3 in
Dec  6 04:44:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:46.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:46 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc0091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:44:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:46.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:44:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:47 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:44:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:47 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:44:47 np0005548918 python3.9[91196]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec  6 04:44:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:47 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:44:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Dec  6 04:44:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:47 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:48 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:48 np0005548918 python3.9[91349]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:44:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:48 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Dec  6 04:44:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e140 e140: 3 total, 3 up, 3 in
Dec  6 04:44:48 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 140 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=80/80 les/c/f=81/81/0 sis=140) [2] r=0 lpr=140 pi=[80,140)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:48.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:48 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:48.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e141 e141: 3 total, 3 up, 3 in
Dec  6 04:44:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:48 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 141 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=80/80 les/c/f=81/81/0 sis=141) [2]/[1] r=-1 lpr=141 pi=[80,141)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:48 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 141 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=80/80 les/c/f=81/81/0 sis=141) [2]/[1] r=-1 lpr=141 pi=[80,141)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:49 np0005548918 python3.9[91501]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec  6 04:44:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:49 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Dec  6 04:44:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:49 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc0091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:49 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e142 e142: 3 total, 3 up, 3 in
Dec  6 04:44:49 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 142 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=109/110 n=5 ec=58/45 lis/c=109/109 les/c/f=110/110/0 sis=142 pruub=8.193440437s) [1] r=-1 lpr=142 pi=[109,142)/1 crt=51'1027 mlcod 0'0 active pruub 235.314407349s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:49 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 142 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=109/110 n=5 ec=58/45 lis/c=109/109 les/c/f=110/110/0 sis=142 pruub=8.193408966s) [1] r=-1 lpr=142 pi=[109,142)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 235.314407349s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:50 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:50 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:44:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:50 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  6 04:44:50 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  6 04:44:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:50.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:50 np0005548918 python3.9[91655]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:44:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:50 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:44:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:50.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:44:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e143 e143: 3 total, 3 up, 3 in
Dec  6 04:44:50 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 143 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=109/110 n=5 ec=58/45 lis/c=109/109 les/c/f=110/110/0 sis=143) [1]/[2] r=0 lpr=143 pi=[109,143)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:50 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 143 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=109/110 n=5 ec=58/45 lis/c=109/109 les/c/f=110/110/0 sis=143) [1]/[2] r=0 lpr=143 pi=[109,143)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:50 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 143 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=141/80 les/c/f=142/81/0 sis=143) [2] r=0 lpr=143 pi=[80,143)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:50 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 143 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=141/80 les/c/f=142/81/0 sis=143) [2] r=0 lpr=143 pi=[80,143)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  6 04:44:51 np0005548918 python3.9[91808]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:44:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:51 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:51 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:51 np0005548918 python3.9[91886]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:44:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:52 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc0091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:52 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e144 e144: 3 total, 3 up, 3 in
Dec  6 04:44:52 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 144 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=143/144 n=5 ec=58/45 lis/c=141/80 les/c/f=142/81/0 sis=143) [2] r=0 lpr=143 pi=[80,143)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:44:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:52.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:52 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb80040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:52.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:52 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 144 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=143/144 n=5 ec=58/45 lis/c=109/109 les/c/f=110/110/0 sis=143) [1]/[2] async=[1] r=0 lpr=143 pi=[109,143)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  6 04:44:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e145 e145: 3 total, 3 up, 3 in
Dec  6 04:44:53 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 145 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=143/144 n=5 ec=58/45 lis/c=143/109 les/c/f=144/110/0 sis=145 pruub=15.831141472s) [1] async=[1] r=-1 lpr=145 pi=[109,145)/1 crt=51'1027 mlcod 51'1027 active pruub 246.209091187s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  6 04:44:53 np0005548918 ceph-osd[78376]: osd.2 pg_epoch: 145 pg[10.1f( v 51'1027 (0'0,51'1027] local-lis/les=143/144 n=5 ec=58/45 lis/c=143/109 les/c/f=144/110/0 sis=145 pruub=15.831076622s) [1] r=-1 lpr=145 pi=[109,145)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 246.209091187s@ mbc={}] state<Start>: transitioning to Stray
Dec  6 04:44:53 np0005548918 python3.9[92040]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:44:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:53 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:54 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:54 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 e146: 3 total, 3 up, 3 in
Dec  6 04:44:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:54.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:54 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc0091d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:54 np0005548918 python3.9[92195]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec  6 04:44:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:44:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:54.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:44:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:55 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb80040d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:55 np0005548918 python3.9[92349]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec  6 04:44:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094455 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:44:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:56 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:56.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:56 np0005548918 python3.9[92503]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  6 04:44:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:56 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4003d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:56.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:56 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:44:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:57 np0005548918 python3.9[92656]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec  6 04:44:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:57 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc0091f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:58 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb80040f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:58.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:58 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:44:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:44:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:58.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:44:58 np0005548918 python3.9[92809]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:44:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:44:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:44:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:44:59 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4004390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:44:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:44:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:00 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:00.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:00 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004110 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:00.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:01 np0005548918 python3.9[92964]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:45:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:01 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:01 np0005548918 python3.9[93117]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:45:01 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:02 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ca4004390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:02 np0005548918 python3.9[93196]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:45:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:02.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:02 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cdc00a700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:45:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:02.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:45:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:03 np0005548918 python3.9[93348]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:45:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:03 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004130 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:03 np0005548918 python3.9[93427]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:45:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:04 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:04.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:04 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8002860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:04.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:04 np0005548918 python3.9[93582]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:45:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:05 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cac001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:06 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb80041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:06.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:06 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:06.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:06 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:07 np0005548918 python3.9[93761]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:45:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:07 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8002860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cac001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:08 np0005548918 python3.9[93914]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec  6 04:45:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:08.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:08 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004200 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:08.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:08 np0005548918 python3.9[94064]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:45:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:09 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:10 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cd8002860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:10 np0005548918 python3.9[94218]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:45:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:10.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:10 np0005548918 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec  6 04:45:10 np0005548918 systemd[1]: tuned.service: Deactivated successfully.
Dec  6 04:45:10 np0005548918 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec  6 04:45:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:10 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cac001e30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:10 np0005548918 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  6 04:45:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:10.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:10 np0005548918 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  6 04:45:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:11 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb8004220 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:11 np0005548918 python3.9[94380]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec  6 04:45:11 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:12 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:12.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:12 np0005548918 kernel: ganesha.nfsd[90179]: segfault at 50 ip 00007f8d8bcdb32e sp 00007f8d43ffe210 error 4 in libntirpc.so.5.8[7f8d8bcc0000+2c000] likely on CPU 5 (core 0, socket 5)
Dec  6 04:45:12 np0005548918 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 04:45:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[84652]: 06/12/2025 09:45:12 : epoch 6933fa70 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cb4004590 fd 48 proxy ignored for local
Dec  6 04:45:12 np0005548918 systemd[1]: Created slice Slice /system/systemd-coredump.
Dec  6 04:45:12 np0005548918 systemd[1]: Started Process Core Dump (PID 94406/UID 0).
Dec  6 04:45:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:12.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:13 np0005548918 systemd-coredump[94407]: Process 84656 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 63:#012#0  0x00007f8d8bcdb32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 04:45:13 np0005548918 systemd[1]: systemd-coredump@0-94406-0.service: Deactivated successfully.
Dec  6 04:45:13 np0005548918 systemd[1]: systemd-coredump@0-94406-0.service: Consumed 1.077s CPU time.
Dec  6 04:45:13 np0005548918 podman[94414]: 2025-12-06 09:45:13.842068624 +0000 UTC m=+0.025171896 container died a3634fe4060dc94c2c20aff61ae4ab07f3ae7c7af9e41801a8e759fad2a4f938 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 04:45:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:13 np0005548918 systemd[1]: var-lib-containers-storage-overlay-3e0607e97819625324c74760b6caea3cab529c87137413bc124ce71f1d8ada59-merged.mount: Deactivated successfully.
Dec  6 04:45:13 np0005548918 podman[94414]: 2025-12-06 09:45:13.882121249 +0000 UTC m=+0.065224481 container remove a3634fe4060dc94c2c20aff61ae4ab07f3ae7c7af9e41801a8e759fad2a4f938 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  6 04:45:13 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Main process exited, code=exited, status=139/n/a
Dec  6 04:45:14 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Failed with result 'exit-code'.
Dec  6 04:45:14 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.830s CPU time.
Dec  6 04:45:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:14.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:14.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:15 np0005548918 python3.9[94585]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:45:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:15 np0005548918 python3.9[94740]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:45:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:16.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:16.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:16 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:16 np0005548918 systemd[1]: session-38.scope: Deactivated successfully.
Dec  6 04:45:16 np0005548918 systemd[1]: session-38.scope: Consumed 1min 3.530s CPU time.
Dec  6 04:45:16 np0005548918 systemd-logind[800]: Session 38 logged out. Waiting for processes to exit.
Dec  6 04:45:16 np0005548918 systemd-logind[800]: Removed session 38.
Dec  6 04:45:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:18.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094518 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:45:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:18.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:19 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:45:19 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:45:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:20 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:45:20 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:45:20 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:45:20 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:45:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:20.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:20.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:21 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:22.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:22 np0005548918 systemd-logind[800]: New session 39 of user zuul.
Dec  6 04:45:22 np0005548918 systemd[1]: Started Session 39 of User zuul.
Dec  6 04:45:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:22.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:23 np0005548918 python3.9[95008]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:45:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:24 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Scheduled restart job, restart counter is at 1.
Dec  6 04:45:24 np0005548918 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:45:24 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.830s CPU time.
Dec  6 04:45:24 np0005548918 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:45:24 np0005548918 podman[95085]: 2025-12-06 09:45:24.285990075 +0000 UTC m=+0.035606415 container create 5bbd0707ce20ed32133f52fc5be40478c3e1dee6c3214441b79974078459fdb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:45:24 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3c285c64e673bac02ec67e13d83178d050b1449d57c31d970b1d4055eba249c/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 04:45:24 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3c285c64e673bac02ec67e13d83178d050b1449d57c31d970b1d4055eba249c/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:45:24 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3c285c64e673bac02ec67e13d83178d050b1449d57c31d970b1d4055eba249c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:45:24 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3c285c64e673bac02ec67e13d83178d050b1449d57c31d970b1d4055eba249c/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.sseuqb-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:45:24 np0005548918 podman[95085]: 2025-12-06 09:45:24.337406953 +0000 UTC m=+0.087023203 container init 5bbd0707ce20ed32133f52fc5be40478c3e1dee6c3214441b79974078459fdb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  6 04:45:24 np0005548918 podman[95085]: 2025-12-06 09:45:24.344090239 +0000 UTC m=+0.093706469 container start 5bbd0707ce20ed32133f52fc5be40478c3e1dee6c3214441b79974078459fdb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  6 04:45:24 np0005548918 bash[95085]: 5bbd0707ce20ed32133f52fc5be40478c3e1dee6c3214441b79974078459fdb8
Dec  6 04:45:24 np0005548918 podman[95085]: 2025-12-06 09:45:24.269972585 +0000 UTC m=+0.019588825 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:45:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 04:45:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 04:45:24 np0005548918 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:45:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 04:45:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 04:45:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 04:45:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 04:45:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 04:45:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.480886) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324480934, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2967, "num_deletes": 252, "total_data_size": 10713369, "memory_usage": 11054976, "flush_reason": "Manual Compaction"}
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324544169, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 6722042, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7757, "largest_seqno": 10719, "table_properties": {"data_size": 6708894, "index_size": 8490, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3589, "raw_key_size": 31688, "raw_average_key_size": 22, "raw_value_size": 6680720, "raw_average_value_size": 4688, "num_data_blocks": 370, "num_entries": 1425, "num_filter_entries": 1425, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014196, "oldest_key_time": 1765014196, "file_creation_time": 1765014324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 63343 microseconds, and 11415 cpu microseconds.
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:45:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:24.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.544236) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 6722042 bytes OK
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.544255) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.547085) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.547101) EVENT_LOG_v1 {"time_micros": 1765014324547097, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.547117) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 10699092, prev total WAL file size 10752907, number of live WAL files 2.
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.549151) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(6564KB)], [18(12MB)]
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324549246, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 19465185, "oldest_snapshot_seqno": -1}
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4124 keys, 14793173 bytes, temperature: kUnknown
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324711060, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 14793173, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14759435, "index_size": 22300, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10373, "raw_key_size": 105187, "raw_average_key_size": 25, "raw_value_size": 14677821, "raw_average_value_size": 3559, "num_data_blocks": 957, "num_entries": 4124, "num_filter_entries": 4124, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765014324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.711313) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 14793173 bytes
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.713659) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 120.2 rd, 91.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(6.4, 12.2 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(5.1) write-amplify(2.2) OK, records in: 4660, records dropped: 536 output_compression: NoCompression
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.713677) EVENT_LOG_v1 {"time_micros": 1765014324713669, "job": 8, "event": "compaction_finished", "compaction_time_micros": 161887, "compaction_time_cpu_micros": 34010, "output_level": 6, "num_output_files": 1, "total_output_size": 14793173, "num_input_records": 4660, "num_output_records": 4124, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324714937, "job": 8, "event": "table_file_deletion", "file_number": 20}
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324717213, "job": 8, "event": "table_file_deletion", "file_number": 18}
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.549023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.717382) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.717390) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.717394) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.717397) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:45:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:45:24.717400) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:45:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:24.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:25 np0005548918 python3.9[95295]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec  6 04:45:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:25 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:45:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:26 np0005548918 python3.9[95475]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:45:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:26.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:45:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:26.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:45:26 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:26 np0005548918 python3.9[95559]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  6 04:45:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:28.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:28.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:29 np0005548918 python3.9[95715]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:45:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:45:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:45:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:30.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:30.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:31 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:31 np0005548918 python3.9[95870]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:45:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:32.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:32.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:32 np0005548918 python3.9[96024]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:45:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:33 np0005548918 python3.9[96177]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec  6 04:45:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:34.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:34.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:34 np0005548918 python3.9[96328]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:45:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094535 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:45:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:35 np0005548918 python3.9[96487]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:45:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:36.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:36.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:36 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:37 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058001c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:38 np0005548918 python3.9[96658]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:45:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:38.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094538 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:45:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:38.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:39 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:39 np0005548918 python3.9[96946]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  6 04:45:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:40 np0005548918 python3.9[97097]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:45:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:40.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:40.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:41 np0005548918 python3.9[97251]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:45:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:41 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:41 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:42.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:42.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:43 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:43 np0005548918 python3.9[97407]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:45:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:44.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:44.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:45 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:46 np0005548918 python3.9[97588]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:45:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:46.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:46.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:45:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:47 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:47 np0005548918 python3.9[97744]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Dec  6 04:45:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:48.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:48.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:49 np0005548918 systemd[1]: session-39.scope: Deactivated successfully.
Dec  6 04:45:49 np0005548918 systemd[1]: session-39.scope: Consumed 17.900s CPU time.
Dec  6 04:45:49 np0005548918 systemd-logind[800]: Session 39 logged out. Waiting for processes to exit.
Dec  6 04:45:49 np0005548918 systemd-logind[800]: Removed session 39.
Dec  6 04:45:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:49 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:49 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:45:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:49 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:45:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:50.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:50.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:51 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:51 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:52.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:52.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:53 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:45:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:53 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:54.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:54.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:55 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:56.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:45:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:56.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:45:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:56 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:45:57 np0005548918 systemd[1]: session-19.scope: Deactivated successfully.
Dec  6 04:45:57 np0005548918 systemd[1]: session-19.scope: Consumed 8.794s CPU time.
Dec  6 04:45:57 np0005548918 systemd-logind[800]: Session 19 logged out. Waiting for processes to exit.
Dec  6 04:45:57 np0005548918 systemd-logind[800]: Removed session 19.
Dec  6 04:45:57 np0005548918 systemd-logind[800]: New session 40 of user zuul.
Dec  6 04:45:57 np0005548918 systemd[1]: Started Session 40 of User zuul.
Dec  6 04:45:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:57 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003c60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:58 np0005548918 python3.9[97933]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:45:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:58.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:45:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:45:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:58.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:45:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094559 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:45:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:45:59 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:45:59 np0005548918 python3.9[98088]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:45:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:45:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:45:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:45:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003c60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:00.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:00.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:01 np0005548918 python3.9[98282]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:46:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:01 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:01 np0005548918 systemd-logind[800]: Session 40 logged out. Waiting for processes to exit.
Dec  6 04:46:01 np0005548918 systemd[1]: session-40.scope: Deactivated successfully.
Dec  6 04:46:01 np0005548918 systemd[1]: session-40.scope: Consumed 2.552s CPU time.
Dec  6 04:46:01 np0005548918 systemd-logind[800]: Removed session 40.
Dec  6 04:46:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:01 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:02.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003c60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:02.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:03 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:04.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:04.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:05 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003c60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:06.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:06.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:06 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:07 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:08.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:08 np0005548918 systemd-logind[800]: New session 41 of user zuul.
Dec  6 04:46:08 np0005548918 systemd[1]: Started Session 41 of User zuul.
Dec  6 04:46:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:08.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:09 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:10 np0005548918 python3.9[98497]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:46:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003c60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:10.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:10.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:11 np0005548918 python3.9[98652]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:46:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:11 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:11 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:12 np0005548918 python3.9[98810]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:46:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:12.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003c60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:12.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:13 np0005548918 python3.9[98894]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:46:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:13 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:14.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:46:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:14.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:46:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:15 np0005548918 python3.9[99049]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:46:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:15 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003c60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:16.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:16 np0005548918 python3.9[99246]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:46:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:16.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:16 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:17 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:17 np0005548918 python3.9[99399]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:46:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:18.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:18 np0005548918 python3.9[99562]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:46:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:18.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:19 np0005548918 python3.9[99640]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:46:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:19 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:19 np0005548918 python3.9[99793]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:46:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:20 np0005548918 python3.9[99872]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:46:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:20.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003c60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:20.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:21 np0005548918 python3.9[100025]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:46:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:21 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:21 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:22 np0005548918 python3.9[100178]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:46:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:22.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:22 np0005548918 python3.9[100330]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:46:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:22.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:23 np0005548918 python3.9[100483]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:46:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:23 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003c60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:24 np0005548918 python3.9[100636]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:46:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:24.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:24.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:25 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:26.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:26 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:46:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:26.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:26 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:27 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:27 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:46:27 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:46:27 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:46:27 np0005548918 python3.9[100897]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:46:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:28 np0005548918 python3.9[101052]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:46:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:28.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c004580 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:28.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:29 np0005548918 python3.9[101205]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:46:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:29 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:30 np0005548918 python3.9[101358]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:46:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:30.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c004580 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:30.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:31 np0005548918 python3.9[101512]: ansible-service_facts Invoked
Dec  6 04:46:31 np0005548918 network[101529]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:46:31 np0005548918 network[101530]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:46:31 np0005548918 network[101531]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:46:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:31 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:32 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:32.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:32.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:32 np0005548918 ceph-mds[84319]: mds.beacon.cephfs.compute-2.czucwy missed beacon ack from the monitors
Dec  6 04:46:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:33 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c004580 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:34.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:34.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:35 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c004580 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:36.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:36.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:37 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:37 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:38.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:38.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:39 np0005548918 python3.9[101991]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:46:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:39 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:40.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:40.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:41 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:46:41 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:46:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:41 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:42 np0005548918 python3.9[102174]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec  6 04:46:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:42.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030000d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:42 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:42.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094643 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:46:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:43 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:43 np0005548918 python3.9[102327]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:46:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:44 np0005548918 python3.9[102406]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:46:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:44.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:44.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:45 np0005548918 python3.9[102558]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:46:45 np0005548918 python3.9[102637]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:46:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:45 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:46 np0005548918 systemd[81631]: Created slice User Background Tasks Slice.
Dec  6 04:46:46 np0005548918 systemd[81631]: Starting Cleanup of User's Temporary Files and Directories...
Dec  6 04:46:46 np0005548918 systemd[81631]: Finished Cleanup of User's Temporary Files and Directories.
Dec  6 04:46:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:46.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:46.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:47 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:47 np0005548918 python3.9[102817]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:46:47 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:48.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:48.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:49 np0005548918 python3.9[102971]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:46:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:49 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:50 np0005548918 python3.9[103056]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:46:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:50.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:50.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:51 np0005548918 systemd[1]: session-41.scope: Deactivated successfully.
Dec  6 04:46:51 np0005548918 systemd[1]: session-41.scope: Consumed 24.262s CPU time.
Dec  6 04:46:51 np0005548918 systemd-logind[800]: Session 41 logged out. Waiting for processes to exit.
Dec  6 04:46:51 np0005548918 systemd-logind[800]: Removed session 41.
Dec  6 04:46:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:51 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c0031d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:52.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:52 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:52.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:53 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:53 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:46:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c0031d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:46:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:54.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:46:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:54.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:55 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300030a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:56.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c0031d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:56.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:46:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:46:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:57 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:46:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:58 np0005548918 systemd-logind[800]: New session 42 of user zuul.
Dec  6 04:46:58 np0005548918 systemd[1]: Started Session 42 of User zuul.
Dec  6 04:46:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300030a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:46:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:58.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:46:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:46:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:46:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:58.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:46:59 np0005548918 python3.9[103247]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:46:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:46:59 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:46:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:46:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:46:59 np0005548918 python3.9[103400]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:46:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:46:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094700 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:47:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:00 np0005548918 python3.9[103479]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:00.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:00.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:00 np0005548918 systemd[1]: session-42.scope: Deactivated successfully.
Dec  6 04:47:00 np0005548918 systemd[1]: session-42.scope: Consumed 1.513s CPU time.
Dec  6 04:47:00 np0005548918 systemd-logind[800]: Session 42 logged out. Waiting for processes to exit.
Dec  6 04:47:00 np0005548918 systemd-logind[800]: Removed session 42.
Dec  6 04:47:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:01 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:02.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:02 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:02.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:03 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:04.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:47:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:04.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:47:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:05 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:47:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:47:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:06.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:06.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:07 np0005548918 systemd-logind[800]: New session 43 of user zuul.
Dec  6 04:47:07 np0005548918 systemd[1]: Started Session 43 of User zuul.
Dec  6 04:47:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:07 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:07 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:08 np0005548918 python3.9[103690]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:47:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:08.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:47:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:08.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:47:09 np0005548918 python3.9[103847]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:09 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:09 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:47:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:10 np0005548918 python3.9[104023]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:10.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:10 np0005548918 python3.9[104102]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.kr_pinj0 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:10.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:11 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:11 np0005548918 python3.9[104256]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:12 np0005548918 python3.9[104335]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.i40k4vvi recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:47:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:12.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:12 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:47:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:12.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:47:13 np0005548918 python3.9[104487]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:47:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:13 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:13 np0005548918 python3.9[104640]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:14 np0005548918 python3.9[104719]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:47:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:47:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:14.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:47:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:14 np0005548918 python3.9[104871]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:14.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:15 np0005548918 python3.9[104950]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:47:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094715 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:47:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:15 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:47:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:15 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:47:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:15 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:16 np0005548918 python3.9[105103]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:47:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:16.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:47:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:16.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:17 np0005548918 python3.9[105255]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:17 np0005548918 python3.9[105334]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:17 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:17 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:18 np0005548918 python3.9[105487]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:18.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:18 np0005548918 python3.9[105565]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:47:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:18.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:19 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:20 np0005548918 python3.9[105719]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:47:20 np0005548918 systemd[1]: Reloading.
Dec  6 04:47:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:20 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:47:20 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:47:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:20.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:20.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:21 np0005548918 python3.9[105909]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:21 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:21 np0005548918 python3.9[105988]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094722 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:47:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:47:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:22.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:47:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:22 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:22.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:23 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:23 np0005548918 python3.9[106141]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:24 np0005548918 python3.9[106220]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:24.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:24.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:25 np0005548918 python3.9[106372]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:47:25 np0005548918 systemd[1]: Reloading.
Dec  6 04:47:25 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:47:25 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:47:25 np0005548918 systemd[1]: Starting Create netns directory...
Dec  6 04:47:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:25 np0005548918 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  6 04:47:25 np0005548918 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  6 04:47:25 np0005548918 systemd[1]: Finished Create netns directory.
Dec  6 04:47:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:25 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:26 np0005548918 python3.9[106591]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:47:26 np0005548918 network[106608]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:47:26 np0005548918 network[106609]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:47:26 np0005548918 network[106610]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:47:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:26.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:26.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:27 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:47:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:28.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:47:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:28.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:29 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:47:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:30.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:47:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:30.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:31 np0005548918 python3.9[106877]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:31 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:31 np0005548918 python3.9[106955]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:32.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:32 np0005548918 python3.9[107108]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:32 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:32.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094733 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:47:33 np0005548918 python3.9[107261]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:33 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:33 np0005548918 python3.9[107340]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:47:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:34.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:47:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.002000052s ======
Dec  6 04:47:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:34.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Dec  6 04:47:35 np0005548918 python3.9[107493]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  6 04:47:35 np0005548918 systemd[1]: Starting Time & Date Service...
Dec  6 04:47:35 np0005548918 systemd[1]: Started Time & Date Service.
Dec  6 04:47:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:35 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:36 np0005548918 python3.9[107650]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:36.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:36 np0005548918 python3.9[107802]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:47:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:36.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:47:37 np0005548918 python3.9[107881]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:37 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:37 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:38 np0005548918 python3.9[108034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:38 np0005548918 python3.9[108112]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.9m3iow5l recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:38.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:38.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:39 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:39 np0005548918 python3.9[108265]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:40 np0005548918 python3.9[108344]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:40.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:40.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:41 np0005548918 python3.9[108575]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:47:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:41 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:41 np0005548918 python3[108732]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  6 04:47:42 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:47:42 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:47:42 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:47:42 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:47:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:47:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:42.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:42 np0005548918 python3.9[108886]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:42 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:42.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:43 np0005548918 python3.9[108964]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:43 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:44 np0005548918 python3.9[109118]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:44 np0005548918 python3.9[109196]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:47:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:44.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:47:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:44.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:45 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:47:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:45 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:47:45 np0005548918 python3.9[109349]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:45 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:47:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:46 np0005548918 python3.9[109428]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:46.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:46.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:47 np0005548918 python3.9[109605]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:47 np0005548918 python3.9[109684]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:47 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:47 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:48 np0005548918 python3.9[109837]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:47:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:47:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:48.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:47:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:48 np0005548918 python3.9[109915]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:48.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:49 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.385937) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469386000, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1514, "num_deletes": 250, "total_data_size": 3825449, "memory_usage": 3880240, "flush_reason": "Manual Compaction"}
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469486930, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1471945, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10724, "largest_seqno": 12233, "table_properties": {"data_size": 1467153, "index_size": 2188, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12146, "raw_average_key_size": 20, "raw_value_size": 1456834, "raw_average_value_size": 2407, "num_data_blocks": 97, "num_entries": 605, "num_filter_entries": 605, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014324, "oldest_key_time": 1765014324, "file_creation_time": 1765014469, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 101106 microseconds, and 8038 cpu microseconds.
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.487043) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1471945 bytes OK
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.487075) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.492114) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.492147) EVENT_LOG_v1 {"time_micros": 1765014469492139, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.492174) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 3818489, prev total WAL file size 3818489, number of live WAL files 2.
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.493616) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1437KB)], [21(14MB)]
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469493703, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 16265118, "oldest_snapshot_seqno": -1}
Dec  6 04:47:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:49 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4280 keys, 14215680 bytes, temperature: kUnknown
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469688410, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 14215680, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14182783, "index_size": 21075, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 108806, "raw_average_key_size": 25, "raw_value_size": 14100342, "raw_average_value_size": 3294, "num_data_blocks": 902, "num_entries": 4280, "num_filter_entries": 4280, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765014469, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.688697) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 14215680 bytes
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.691618) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 83.5 rd, 73.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 14.1 +0.0 blob) out(13.6 +0.0 blob), read-write-amplify(20.7) write-amplify(9.7) OK, records in: 4729, records dropped: 449 output_compression: NoCompression
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.691657) EVENT_LOG_v1 {"time_micros": 1765014469691638, "job": 10, "event": "compaction_finished", "compaction_time_micros": 194785, "compaction_time_cpu_micros": 54442, "output_level": 6, "num_output_files": 1, "total_output_size": 14215680, "num_input_records": 4729, "num_output_records": 4280, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469692215, "job": 10, "event": "table_file_deletion", "file_number": 23}
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469694792, "job": 10, "event": "table_file_deletion", "file_number": 21}
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.493466) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.694848) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.694855) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.694858) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.694861) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:47:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:47:49.694863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:47:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:50 np0005548918 python3.9[110069]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:47:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300013d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:50.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:50 np0005548918 python3.9[110224]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:47:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:50.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:47:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:51 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:51 np0005548918 python3.9[110377]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:52 np0005548918 python3.9[110555]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:47:52 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:47:52 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:47:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:52.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:52 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:53.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:53 np0005548918 python3.9[110710]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  6 04:47:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:53 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:54 np0005548918 python3.9[110863]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  6 04:47:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:54 np0005548918 systemd[1]: session-43.scope: Deactivated successfully.
Dec  6 04:47:54 np0005548918 systemd[1]: session-43.scope: Consumed 29.592s CPU time.
Dec  6 04:47:54 np0005548918 systemd-logind[800]: Session 43 logged out. Waiting for processes to exit.
Dec  6 04:47:54 np0005548918 systemd-logind[800]: Removed session 43.
Dec  6 04:47:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:54.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:55.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094755 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:47:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:55 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:47:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:56.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:47:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:57.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:57 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:47:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:58.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:47:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:47:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:59.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:47:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:47:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:47:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:47:59 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:47:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:47:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:00 np0005548918 systemd-logind[800]: New session 44 of user zuul.
Dec  6 04:48:00 np0005548918 systemd[1]: Started Session 44 of User zuul.
Dec  6 04:48:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:00.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:01.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:01 np0005548918 python3.9[111050]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec  6 04:48:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:01 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:02 np0005548918 python3.9[111203]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:48:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:02.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:02 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:03.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:03 np0005548918 python3.9[111358]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Dec  6 04:48:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:03 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:03 np0005548918 python3.9[111510]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.alrrm9sx follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000a890 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:04 np0005548918 python3.9[111636]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.alrrm9sx mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014483.5059352-104-167851636529233/.source.alrrm9sx _original_basename=.lcflw2m4 follow=False checksum=741dc69011fb61b699872c865e152b9968457717 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:04.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:05.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:05 np0005548918 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  6 04:48:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:05 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:05 np0005548918 python3.9[111791]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:48:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030004200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:06 np0005548918 python3.9[111969]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDneZurSARwLaZA1xEymzXlvVAPvP8u0PCrqXuMYD5ewImDDChRITnk4XHKT/DUfrSJf9/7oJsddEbLRjhCtedqrMZsCkWz1BxtCmPBuvz2LfFhEn27TjqYLctOVGigQGsj6ILvPOzzLiapd93yApWDmH6P0un/ltmdM0iZLygNpzG3HLF8STBXzlo/8slci69Em7XppcrOpl1TS7DaVlpNcRQvo9pFuIrbMD9g0DOdMwk5YCH6g7OzGWqq0gt0YUOztmsqxWHKav3E0SXAD/vkgRc/1ZCNGFNSvf0dIgimCF3xlNWrppnvNgQ1BRqiQ7RArlOp1bVg0Ugdce6f4TIrq36Ois2U5+/myF5WQ7l9hRMRvoP64hSSsRAIDobTI/zMStUP3iZPFngxDxwQtpydHfFGywBL9811c42U7JsGxE8890uOIDk/oOkyhSH6KHQCPFjmKBJ98nT01lgnXyFSNOqds6QOYBasUWNFWd2wS7YpTheGlVVM8bk/gB4K2L0=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOMkn8zp09tRuEaH/bUoP0rYj+dziM1KcqMKxOgM9K1U#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCrMdvJJYP0cflC7RDFsxwr66nSp9R7QU726CAfJcKLw6vHh8Z9Lw5wLH0kiaSpsb6SAPffloplHEDiwTOkghOc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAiB67qk/R3IfGpcAH1Ojopc8KX94De+Kxs31cKQLD04X+4QRXPRdMxU85LOhN58eKoHaBi8cgqk7+dvRypGD5vbtbRN9r0VN7tGwiSQTlVFbEuhn0AEbnRwNAMWEEMHO9kEjufP4N2zEEhtQBXy9oO2tMX3+BX4Z3YZZMQyZUgohdBHp2VCul9VdRuo0oHSr8HHm0nN61dMjalnThmgkGAu5hG8qhkWT4i9hroSKBsR5kVBUFTqdXekYkVy4YIYfM2lBXiMOFHtvr1a+KOyIfgWMb7GBPW7oKqtzCfVgSbGaUhSvGzs1OWt3U/PjjapIlmDnwD5ukzVxWV5ldh0vA48tXh5R1wqAoN5/Y/RiAKaY2kd/fvtkhvVDGZluXOz5jJ02IFHm+v4dP3Ig8YOuS5BEkWFuJHkblW0t/+4siTHWwmGEuvUI6y8Gb2pGcBKsWCJtLePYzT09IAmrjwO0jAgbWy0nvCZ+SKlbBBrXP6OgNgMkA+GH9iGOl6FOuRok=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGYNj3LmNvR0emoQHuuy9NKXPivs/dznunVy8GExnJl8#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJhKmGSvg8FMw16qKPzk6Pyj+OHkN3bmk20mts1PdCRcNRnn9sT1DgI6U8Aze1tjGPujT4eDL+Y9r/hsrfM4qDc=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCtvqYC0W0zPSX/plyJvm0q1VGDScYTNlcCdllukOe81JRfU3GhVusPZOX0xRSaLP/lmXtfqWcbBRCkLsmFrAo2EHn1CMqMr5WkhY4+rgApF+MGLDOUo57tlKZLPIwdL0SSY/Qv8lBfrqr7LUDZ7fTTTbqTzim/bncxg/u0KxSWBdvjfmYi13SwO65wDkFqSVYa3h8DNij6cRRjQ0fJuJ9Da860hmMnqo9GJMU6dq3zMXXn3YfuF4E4M0UQdlWmVW4EwBTzsfA1XYbSpW7VdRJw6esB4vZ9/Succj+XZiANoDqL9gXSEjNXVVWVbL/7aGJJF9LLQ3VVxmHdbYs1NcTI6Yy9d61zDJHnK/nlYHMhmAHxiDsZEpv0xF72LLzaI86xxvnbx4eUpnyW6LnKiUCYUAUrWIMpLiIbWUxeIoYmj9rqLhwlo5kCy7WdCYYEMTtGI53oIyU0EbXf/r4WAuzmqpVRPyc2Sd5tYD4aXh1JZLUcZy+NLR0Y4SA8RflKFcs=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFDJYF6pUvFgGUbY2QEOHAq7ZEhRQJUqPTVPOuTyb476#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPJ19afQPeSMtr3O9L1fe5+bNzTAsOOCA5fLihUdryDYc29KKD+0XABHKIvqeefcCsIBjZRA//9OzCUftfvXK9A=#012 create=True mode=0644 path=/tmp/ansible.alrrm9sx state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:06.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000a890 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:07.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:07 np0005548918 python3.9[112122]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.alrrm9sx' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:48:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:07 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:07 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:08 np0005548918 python3.9[112277]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.alrrm9sx state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:08.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030004200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:08 np0005548918 systemd[1]: session-44.scope: Deactivated successfully.
Dec  6 04:48:08 np0005548918 systemd[1]: session-44.scope: Consumed 5.441s CPU time.
Dec  6 04:48:08 np0005548918 systemd-logind[800]: Session 44 logged out. Waiting for processes to exit.
Dec  6 04:48:08 np0005548918 systemd-logind[800]: Removed session 44.
Dec  6 04:48:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:09.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:09 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000a890 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004450 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:10.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:11.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:11 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030004200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000a890 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:12.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:12 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:13.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:13 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:14.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c0019c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:15.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:15 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:16.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:17.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:17 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c0019c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:17 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:18 np0005548918 systemd-logind[800]: New session 45 of user zuul.
Dec  6 04:48:18 np0005548918 systemd[1]: Started Session 45 of User zuul.
Dec  6 04:48:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.002000052s ======
Dec  6 04:48:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:18.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Dec  6 04:48:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004540 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:19.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:19 np0005548918 python3.9[112468]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:48:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:19 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c0019c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:20 np0005548918 python3.9[112626]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  6 04:48:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:20.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c0019c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:21.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:21 np0005548918 python3.9[112781]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:48:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:21 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004560 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:22.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:22 np0005548918 python3.9[112935]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:48:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c0019c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:22 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:23.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:23 np0005548918 python3.9[113089]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:48:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:23 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004580 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:24 np0005548918 python3.9[113242]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:24.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004580 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:24 np0005548918 systemd[1]: session-45.scope: Deactivated successfully.
Dec  6 04:48:24 np0005548918 systemd[1]: session-45.scope: Consumed 3.919s CPU time.
Dec  6 04:48:24 np0005548918 systemd-logind[800]: Session 45 logged out. Waiting for processes to exit.
Dec  6 04:48:24 np0005548918 systemd-logind[800]: Removed session 45.
Dec  6 04:48:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:25.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:25 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:26.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:27.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:27 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580045a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:28.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:29.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:29 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:30 np0005548918 systemd-logind[800]: New session 46 of user zuul.
Dec  6 04:48:30 np0005548918 systemd[1]: Started Session 46 of User zuul.
Dec  6 04:48:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:30.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:31.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:31 np0005548918 python3.9[113451]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:48:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:31 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:32 np0005548918 python3.9[113609]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:48:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:32.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580045e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:32 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:33.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:33 np0005548918 python3.9[113693]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  6 04:48:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:33 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:34.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:35.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:35 np0005548918 python3.9[113847]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:48:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:35 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:36.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:36 np0005548918 python3.9[114001]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  6 04:48:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:36 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:48:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:37.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:37 np0005548918 python3.9[114153]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:48:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:37 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:37 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:38 np0005548918 python3.9[114304]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:48:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:38.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:38 np0005548918 systemd[1]: session-46.scope: Deactivated successfully.
Dec  6 04:48:38 np0005548918 systemd[1]: session-46.scope: Consumed 5.975s CPU time.
Dec  6 04:48:38 np0005548918 systemd-logind[800]: Session 46 logged out. Waiting for processes to exit.
Dec  6 04:48:38 np0005548918 systemd-logind[800]: Removed session 46.
Dec  6 04:48:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:39.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:39 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:40.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:41.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:41 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:42.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:42 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:43.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:43 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:44 np0005548918 systemd-logind[800]: New session 47 of user zuul.
Dec  6 04:48:44 np0005548918 systemd[1]: Started Session 47 of User zuul.
Dec  6 04:48:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:44.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:45.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:45 np0005548918 python3.9[114490]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:48:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:45 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:46.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:46 np0005548918 python3.9[114673]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:48:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:47.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:47 np0005548918 python3.9[114826]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:48:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:47 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:47 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:48 np0005548918 python3.9[114979]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:48.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:49 np0005548918 python3.9[115102]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014527.772917-157-227175418423868/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=d5d7235f1ec552440fcdbbdebdb9f2626d8d05fb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:49.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:49 np0005548918 python3.9[115255]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:49 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:50 np0005548918 python3.9[115379]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014529.2009106-157-170964464239760/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=f805cc6455e59702aa77bd6ffe81bb9b155b0be7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:48:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:50.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:48:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:50 np0005548918 python3.9[115531]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:51.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:51 np0005548918 python3.9[115655]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014530.3959033-157-69240977776649/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=b2908e3c81d5ce2c96f26a366a472f93f3723a26 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:51 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:52 np0005548918 python3.9[115808]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:48:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:52 np0005548918 python3.9[116041]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:48:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:52.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:52 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:53.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:48:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:48:53 np0005548918 python3.9[116194]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:53 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:53 np0005548918 python3.9[116317]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014532.9479964-332-249619028450231/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=07d1283c96a5b3b26441a8278d4aff5866e5e883 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:54 np0005548918 python3.9[116470]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:54.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580047a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:54 np0005548918 python3.9[116593]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014534.0785396-332-188231017003692/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=72139a22070e52361b83b34c98df3f4b6e2a8fd5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:55.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:55 np0005548918 python3.9[116746]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:55 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:56 np0005548918 python3.9[116870]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014535.1151645-332-3813759801066/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=9ac58f4396c9150f49242eb43e335a2b2b6ba116 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:56.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:56 np0005548918 python3.9[117022]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:48:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:57.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:57 np0005548918 python3.9[117175]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:48:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:57 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:48:58 np0005548918 python3.9[117353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:58 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:58 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:48:58 np0005548918 python3.9[117476]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014537.6954563-506-133744835495212/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=58011a6ee2230ebc9c7c79d1b22b1491b465dc47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:58.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:48:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:48:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:59.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:48:59 np0005548918 python3.9[117628]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:48:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:48:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:48:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:48:59 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:48:59 np0005548918 python3.9[117752]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014538.7413204-506-158148258928866/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=72139a22070e52361b83b34c98df3f4b6e2a8fd5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:48:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:48:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:00 np0005548918 python3.9[117905]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:00.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:01.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:01 np0005548918 python3.9[118028]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014540.1357446-506-229587876806246/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=9000f4eac2ceee4fb66e7225b24c6b44245d1c60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:01 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:02 np0005548918 python3.9[118184]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:02.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:02 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:03.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:03 np0005548918 python3.9[118336]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:03 np0005548918 python3.9[118460]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014542.6507404-716-13830102938617/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:03 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:04 np0005548918 python3.9[118613]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:04.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:05 np0005548918 python3.9[118765]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:05.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:05 np0005548918 python3.9[118889]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014544.6144192-791-154982170190125/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:05 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:06 np0005548918 python3.9[119042]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:06.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:07 np0005548918 python3.9[119222]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:07.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:07 np0005548918 python3.9[119346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014546.575022-863-41449173655982/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:07 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:07 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:08 np0005548918 python3.9[119499]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:08.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:09.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:09 np0005548918 python3.9[119652]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:09 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:09 np0005548918 python3.9[119775]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014548.7930994-940-257276593867012/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:10 np0005548918 python3.9[119928]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:10.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:49:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:11.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:49:11 np0005548918 python3.9[120081]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:11 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:11 np0005548918 python3.9[120204]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014550.8033996-1014-41562864344730/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:12 np0005548918 python3.9[120357]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:12.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:12 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:13.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:13 np0005548918 python3.9[120509]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:13 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:13 np0005548918 python3.9[120633]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014552.7190711-1083-38278729314253/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:14 np0005548918 systemd[1]: session-47.scope: Deactivated successfully.
Dec  6 04:49:14 np0005548918 systemd[1]: session-47.scope: Consumed 23.144s CPU time.
Dec  6 04:49:14 np0005548918 systemd-logind[800]: Session 47 logged out. Waiting for processes to exit.
Dec  6 04:49:14 np0005548918 systemd-logind[800]: Removed session 47.
Dec  6 04:49:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:14.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:15.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:15 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.218927) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556218960, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1083, "num_deletes": 251, "total_data_size": 2728939, "memory_usage": 2771712, "flush_reason": "Manual Compaction"}
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556242578, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 1766352, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12238, "largest_seqno": 13316, "table_properties": {"data_size": 1761486, "index_size": 2454, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10182, "raw_average_key_size": 19, "raw_value_size": 1751812, "raw_average_value_size": 3299, "num_data_blocks": 109, "num_entries": 531, "num_filter_entries": 531, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014470, "oldest_key_time": 1765014470, "file_creation_time": 1765014556, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 23712 microseconds, and 4493 cpu microseconds.
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.242635) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 1766352 bytes OK
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.242653) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.243903) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.243916) EVENT_LOG_v1 {"time_micros": 1765014556243912, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.243931) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 2723663, prev total WAL file size 2723663, number of live WAL files 2.
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.244647) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1724KB)], [24(13MB)]
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556244733, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15982032, "oldest_snapshot_seqno": -1}
Dec  6 04:49:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4293 keys, 14024878 bytes, temperature: kUnknown
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556394417, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 14024878, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13992883, "index_size": 20173, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 109870, "raw_average_key_size": 25, "raw_value_size": 13911162, "raw_average_value_size": 3240, "num_data_blocks": 852, "num_entries": 4293, "num_filter_entries": 4293, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765014556, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.394643) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 14024878 bytes
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.396380) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 106.7 rd, 93.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.6 +0.0 blob) out(13.4 +0.0 blob), read-write-amplify(17.0) write-amplify(7.9) OK, records in: 4811, records dropped: 518 output_compression: NoCompression
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.396402) EVENT_LOG_v1 {"time_micros": 1765014556396392, "job": 12, "event": "compaction_finished", "compaction_time_micros": 149750, "compaction_time_cpu_micros": 36350, "output_level": 6, "num_output_files": 1, "total_output_size": 14024878, "num_input_records": 4811, "num_output_records": 4293, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556396770, "job": 12, "event": "table_file_deletion", "file_number": 26}
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556399282, "job": 12, "event": "table_file_deletion", "file_number": 24}
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.244530) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.399308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.399312) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.399314) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.399315) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:49:16 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:49:16.399316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:49:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:16.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:17.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:17 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:17 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094918 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:49:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:18.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:19.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:19 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:20 np0005548918 systemd-logind[800]: New session 48 of user zuul.
Dec  6 04:49:20 np0005548918 systemd[1]: Started Session 48 of User zuul.
Dec  6 04:49:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:20.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:20 np0005548918 python3.9[120820]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:21.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:21 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:21 np0005548918 python3.9[120973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:22 np0005548918 python3.9[121097]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014561.1728442-64-243282356837795/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=944de880f37676f80f6e04a4864888bf3f7decbf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:22.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:22 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:23.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:23 np0005548918 python3.9[121249]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:23 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:23 np0005548918 python3.9[121373]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014562.7581544-64-31076143229341/.source.conf _original_basename=ceph.conf follow=False checksum=531c84d7e2c99e4f6cf7d56dd7b16abeaf31bfa1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:24 np0005548918 systemd[1]: session-48.scope: Deactivated successfully.
Dec  6 04:49:24 np0005548918 systemd[1]: session-48.scope: Consumed 2.780s CPU time.
Dec  6 04:49:24 np0005548918 systemd-logind[800]: Session 48 logged out. Waiting for processes to exit.
Dec  6 04:49:24 np0005548918 systemd-logind[800]: Removed session 48.
Dec  6 04:49:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:24.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:25.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:25 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:26.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:27.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:27 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:49:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:27 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:28.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:29.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:29 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:30 np0005548918 systemd-logind[800]: New session 49 of user zuul.
Dec  6 04:49:30 np0005548918 systemd[1]: Started Session 49 of User zuul.
Dec  6 04:49:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:49:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:49:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:30.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:31.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:31 np0005548918 python3.9[121584]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:49:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:31 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:32 np0005548918 python3.9[121742]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:32.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:32 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:33 np0005548918 python3.9[121894]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:49:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:33.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:33 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:49:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:33 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c0010b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:33 np0005548918 python3.9[122045]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:49:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:34 np0005548918 python3.9[122198]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  6 04:49:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:34.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:35.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:35 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c0010d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:36 np0005548918 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec  6 04:49:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:36.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:37 np0005548918 python3.9[122356]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:49:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:37.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:37 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:37 np0005548918 python3.9[122441]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:49:37 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:38.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:39.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:39 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580049d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/094940 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:49:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:40 np0005548918 python3.9[122598]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:49:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:40.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:41.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:41 np0005548918 python3[122755]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec  6 04:49:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:41 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580049d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:42 np0005548918 python3.9[122908]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580049d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:42.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:42 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:43.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:43 np0005548918 python3.9[123061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:43 np0005548918 python3.9[123139]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:43 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:44 np0005548918 python3.9[123292]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:44.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:44 np0005548918 python3.9[123370]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.t7l5ebeg recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:45.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:45 np0005548918 python3.9[123523]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:45 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580049d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:46 np0005548918 python3.9[123602]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:46.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:47 np0005548918 python3.9[123779]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:49:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:47.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:47 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:47 np0005548918 python3[123933]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  6 04:49:47 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580049f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:48 np0005548918 python3.9[124086]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:48.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:49.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:49 np0005548918 python3.9[124212]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014588.2525194-434-105109996405836/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:49 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:50 np0005548918 python3.9[124365]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:50 np0005548918 python3.9[124490]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014589.6649246-478-93930368602595/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:50.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:49:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:51.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:49:51 np0005548918 python3.9[124644]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:51 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:52 np0005548918 python3.9[124770]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014591.0549746-524-71064660601361/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:52 np0005548918 python3.9[124922]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:52.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:52 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:49:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:53.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:49:53 np0005548918 python3.9[125048]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014592.326636-568-16547946041853/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:53 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:54 np0005548918 python3.9[125201]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:49:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:54.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:54 np0005548918 python3.9[125328]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014593.657923-613-78974965230204/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:55.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:55 np0005548918 python3.9[125481]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:55 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:56 np0005548918 python3.9[125634]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:49:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004a30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:49:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:56.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:49:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:57.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:57 np0005548918 python3.9[125790]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:49:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:57 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:49:58 np0005548918 python3.9[125993]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:49:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004a50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:49:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:58.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:49:58 np0005548918 python3.9[126235]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:49:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:49:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:49:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:59.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:49:59 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:49:59 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:49:59 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:49:59 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:49:59 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:49:59 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:49:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:49:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:49:59 np0005548918 python3.9[126402]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:49:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:49:59 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:49:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:49:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:00 np0005548918 ceph-mon[75798]: overall HEALTH_OK
Dec  6 04:50:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:00 np0005548918 python3.9[126558]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:00.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:01.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:01 np0005548918 python3.9[126709]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:50:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:01 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004a70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:02.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:02 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:03 np0005548918 python3.9[126863]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:f2:93:49:d5" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:50:03 np0005548918 ovs-vsctl[126865]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:f2:93:49:d5 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec  6 04:50:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:03.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:03 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:03 np0005548918 python3.9[127017]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:50:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004a90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:04 np0005548918 python3.9[127198]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:50:04 np0005548918 ovs-vsctl[127199]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec  6 04:50:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:04.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:05 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:50:05 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:50:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:05.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:05 np0005548918 python3.9[127350]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:50:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:05 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:06 np0005548918 python3.9[127505]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:06 np0005548918 python3.9[127657]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:06.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:07.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:07 np0005548918 python3.9[127760]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:07 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:07 np0005548918 python3.9[127913]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:07 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:08 np0005548918 python3.9[127992]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000ac80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:50:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:08.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:50:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:09 np0005548918 python3.9[128146]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:09.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:09 np0005548918 python3.9[128299]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:09 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:10 np0005548918 python3.9[128378]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:10 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 04:50:10 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.5 total, 600.0 interval#012Cumulative writes: 2259 writes, 13K keys, 2259 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s#012Cumulative WAL: 2259 writes, 2259 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2259 writes, 13K keys, 2259 commit groups, 1.0 writes per commit group, ingest: 38.38 MB, 0.06 MB/s#012Interval WAL: 2259 writes, 2259 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     30.7      0.72              0.06         6    0.120       0      0       0.0       0.0#012  L6      1/0   13.38 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.0     89.1     78.7      0.83              0.18         5    0.167     21K   2283       0.0       0.0#012 Sum      1/0   13.38 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.0     47.9     56.5      1.55              0.23        11    0.141     21K   2283       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.0     62.3     73.5      1.19              0.23        10    0.119     21K   2283       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0     89.1     78.7      0.83              0.18         5    0.167     21K   2283       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     61.5      0.36              0.06         5    0.072       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.36              0.00         1    0.360       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.5 total, 600.0 interval#012Flush(GB): cumulative 0.022, interval 0.022#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.09 GB write, 0.15 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.6 seconds#012Interval compaction: 0.09 GB write, 0.15 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.2 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55784c777350#2 capacity: 304.00 MB usage: 1.62 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 8.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(91,1.41 MB,0.465117%) FilterBlock(11,71.42 KB,0.0229434%) IndexBlock(11,143.02 KB,0.045942%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  6 04:50:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:10.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:11 np0005548918 python3.9[128530]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:11.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:11 np0005548918 python3.9[128609]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:11 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:12 np0005548918 python3.9[128762]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:50:12 np0005548918 systemd[1]: Reloading.
Dec  6 04:50:12 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:50:12 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:50:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:12.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:12 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:13.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:13 np0005548918 python3.9[128954]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:13 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:14 np0005548918 python3.9[129033]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004af0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:14.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:14 np0005548918 python3.9[129185]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:15.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:15 np0005548918 python3.9[129264]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:15 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:16 np0005548918 python3.9[129417]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:50:16 np0005548918 systemd[1]: Reloading.
Dec  6 04:50:16 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:50:16 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:50:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:16 np0005548918 systemd[1]: Starting Create netns directory...
Dec  6 04:50:16 np0005548918 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  6 04:50:16 np0005548918 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  6 04:50:16 np0005548918 systemd[1]: Finished Create netns directory.
Dec  6 04:50:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:16.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:17.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:17 np0005548918 python3.9[129612]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:17 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:17 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:17 np0005548918 python3.9[129765]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095018 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:50:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:18 np0005548918 python3.9[129888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014617.5769098-1366-151153727016436/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:18.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:19.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:19 np0005548918 python3.9[130041]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:19 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:20 np0005548918 python3.9[130194]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:20 np0005548918 python3.9[130317]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014619.736215-1441-81130139354306/.source.json _original_basename=.alhlnc1f follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:20.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:21.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:21 np0005548918 python3.9[130470]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:21 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:22.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:22 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:23.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:23 np0005548918 python3.9[130899]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec  6 04:50:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:23 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030002110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:24 np0005548918 python3.9[131052]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  6 04:50:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:24.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:25.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:25 np0005548918 python3.9[131205]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  6 04:50:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:25 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:26.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:27 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:50:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:27.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:27 np0005548918 python3[131409]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  6 04:50:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:27 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:28.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:29.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:29 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:50:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:50:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:50:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004b50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c160 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:50:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:30.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:50:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:31.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:31 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:32 np0005548918 podman[131422]: 2025-12-06 09:50:32.504667895 +0000 UTC m=+4.770050952 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c
Dec  6 04:50:32 np0005548918 podman[131548]: 2025-12-06 09:50:32.633713672 +0000 UTC m=+0.047481616 container create 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  6 04:50:32 np0005548918 podman[131548]: 2025-12-06 09:50:32.608118021 +0000 UTC m=+0.021885995 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c
Dec  6 04:50:32 np0005548918 python3[131409]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c
Dec  6 04:50:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030002110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:32.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:32 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:33.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:33 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:50:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:33 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:33 np0005548918 python3.9[131739]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:50:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c1a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:34.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:34 np0005548918 python3.9[131894]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:35.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:35 np0005548918 python3.9[131971]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:50:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:35 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030002110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:36 np0005548918 python3.9[132123]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014635.6386223-1705-12935713778607/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:50:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:36 np0005548918 python3.9[132199]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:50:36 np0005548918 systemd[1]: Reloading.
Dec  6 04:50:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004bb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:36 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:50:36 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:50:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:36.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:37.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:37 np0005548918 python3.9[132312]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:50:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:37 np0005548918 systemd[1]: Reloading.
Dec  6 04:50:37 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:50:37 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:50:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:37 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c1c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:37 np0005548918 systemd[1]: Starting ovn_controller container...
Dec  6 04:50:37 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:38 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:50:38 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1109ced00fc510e2752871a338cd490540a452fc22b1f1d9a61ac0e0d70019bc/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  6 04:50:38 np0005548918 systemd[1]: Started /usr/bin/podman healthcheck run 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20.
Dec  6 04:50:38 np0005548918 podman[132355]: 2025-12-06 09:50:38.137766355 +0000 UTC m=+0.133963170 container init 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: + sudo -E kolla_set_configs
Dec  6 04:50:38 np0005548918 podman[132355]: 2025-12-06 09:50:38.17474716 +0000 UTC m=+0.170943965 container start 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 04:50:38 np0005548918 edpm-start-podman-container[132355]: ovn_controller
Dec  6 04:50:38 np0005548918 systemd[1]: Created slice User Slice of UID 0.
Dec  6 04:50:38 np0005548918 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec  6 04:50:38 np0005548918 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec  6 04:50:38 np0005548918 systemd[1]: Starting User Manager for UID 0...
Dec  6 04:50:38 np0005548918 edpm-start-podman-container[132354]: Creating additional drop-in dependency for "ovn_controller" (0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20)
Dec  6 04:50:38 np0005548918 systemd[1]: Reloading.
Dec  6 04:50:38 np0005548918 podman[132378]: 2025-12-06 09:50:38.272953216 +0000 UTC m=+0.088542570 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 04:50:38 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:50:38 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:50:38 np0005548918 systemd[132403]: Queued start job for default target Main User Target.
Dec  6 04:50:38 np0005548918 systemd[132403]: Created slice User Application Slice.
Dec  6 04:50:38 np0005548918 systemd[132403]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec  6 04:50:38 np0005548918 systemd[132403]: Started Daily Cleanup of User's Temporary Directories.
Dec  6 04:50:38 np0005548918 systemd[132403]: Reached target Paths.
Dec  6 04:50:38 np0005548918 systemd[132403]: Reached target Timers.
Dec  6 04:50:38 np0005548918 systemd[132403]: Starting D-Bus User Message Bus Socket...
Dec  6 04:50:38 np0005548918 systemd[132403]: Starting Create User's Volatile Files and Directories...
Dec  6 04:50:38 np0005548918 systemd[132403]: Finished Create User's Volatile Files and Directories.
Dec  6 04:50:38 np0005548918 systemd[132403]: Listening on D-Bus User Message Bus Socket.
Dec  6 04:50:38 np0005548918 systemd[132403]: Reached target Sockets.
Dec  6 04:50:38 np0005548918 systemd[132403]: Reached target Basic System.
Dec  6 04:50:38 np0005548918 systemd[132403]: Reached target Main User Target.
Dec  6 04:50:38 np0005548918 systemd[132403]: Startup finished in 145ms.
Dec  6 04:50:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c1c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:38 np0005548918 systemd[1]: Started User Manager for UID 0.
Dec  6 04:50:38 np0005548918 systemd[1]: Started ovn_controller container.
Dec  6 04:50:38 np0005548918 systemd[1]: 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20-36074a73904539d2.service: Main process exited, code=exited, status=1/FAILURE
Dec  6 04:50:38 np0005548918 systemd[1]: 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20-36074a73904539d2.service: Failed with result 'exit-code'.
Dec  6 04:50:38 np0005548918 systemd[1]: Started Session c1 of User root.
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: INFO:__main__:Validating config file
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: INFO:__main__:Writing out command to execute
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: ++ cat /run_command
Dec  6 04:50:38 np0005548918 systemd[1]: session-c1.scope: Deactivated successfully.
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: + ARGS=
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: + sudo kolla_copy_cacerts
Dec  6 04:50:38 np0005548918 systemd[1]: Started Session c2 of User root.
Dec  6 04:50:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: + [[ ! -n '' ]]
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: + . kolla_extend_start
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: + umask 0022
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec  6 04:50:38 np0005548918 systemd[1]: session-c2.scope: Deactivated successfully.
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec  6 04:50:38 np0005548918 NetworkManager[48884]: <info>  [1765014638.7275] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Dec  6 04:50:38 np0005548918 NetworkManager[48884]: <info>  [1765014638.7283] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 04:50:38 np0005548918 NetworkManager[48884]: <info>  [1765014638.7292] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec  6 04:50:38 np0005548918 NetworkManager[48884]: <info>  [1765014638.7296] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Dec  6 04:50:38 np0005548918 NetworkManager[48884]: <info>  [1765014638.7299] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00010|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00011|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00013|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Dec  6 04:50:38 np0005548918 kernel: br-int: entered promiscuous mode
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00017|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00018|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00019|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00020|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00021|features|INFO|OVS Feature: ct_flush, state: supported
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00022|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec  6 04:50:38 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:38Z|00023|main|INFO|OVS feature set changed, force recompute.
Dec  6 04:50:38 np0005548918 systemd-udevd[132499]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 04:50:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:38.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:39.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:39 np0005548918 python3.9[132630]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:50:39 np0005548918 ovs-vsctl[132631]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec  6 04:50:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:39 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:39Z|00024|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  6 04:50:39 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:39Z|00025|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  6 04:50:39 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:39Z|00026|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  6 04:50:39 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:39Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  6 04:50:39 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:39Z|00028|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec  6 04:50:39 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:39Z|00029|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec  6 04:50:39 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:39Z|00030|main|INFO|OVS feature set changed, force recompute.
Dec  6 04:50:39 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:39Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  6 04:50:39 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:39Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  6 04:50:39 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:39Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  6 04:50:39 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:39Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  6 04:50:39 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:39Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  6 04:50:39 np0005548918 ovn_controller[132371]: 2025-12-06T09:50:39Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  6 04:50:39 np0005548918 NetworkManager[48884]: <info>  [1765014639.7419] manager: (ovn-61eba4-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec  6 04:50:39 np0005548918 NetworkManager[48884]: <info>  [1765014639.7432] manager: (ovn-d39b5b-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Dec  6 04:50:39 np0005548918 NetworkManager[48884]: <info>  [1765014639.7442] manager: (ovn-127282-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Dec  6 04:50:39 np0005548918 kernel: genev_sys_6081: entered promiscuous mode
Dec  6 04:50:39 np0005548918 NetworkManager[48884]: <info>  [1765014639.7697] device (genev_sys_6081): carrier: link connected
Dec  6 04:50:39 np0005548918 NetworkManager[48884]: <info>  [1765014639.7701] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Dec  6 04:50:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:39 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:40 np0005548918 python3.9[132788]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:50:40 np0005548918 ovs-vsctl[132790]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec  6 04:50:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095040 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:50:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c1c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:40.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:41.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:41 np0005548918 python3.9[132944]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:50:41 np0005548918 ovs-vsctl[132945]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec  6 04:50:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:41 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:41 np0005548918 systemd[1]: session-49.scope: Deactivated successfully.
Dec  6 04:50:41 np0005548918 systemd[1]: session-49.scope: Consumed 56.522s CPU time.
Dec  6 04:50:41 np0005548918 systemd-logind[800]: Session 49 logged out. Waiting for processes to exit.
Dec  6 04:50:41 np0005548918 systemd-logind[800]: Removed session 49.
Dec  6 04:50:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c1e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:42.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:42 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:43.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:43 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:44.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:45.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:45 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80600089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:46.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:47.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:47 np0005548918 systemd-logind[800]: New session 51 of user zuul.
Dec  6 04:50:47 np0005548918 systemd[1]: Started Session 51 of User zuul.
Dec  6 04:50:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:47 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058004bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:47 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c220 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:48 np0005548918 systemd[1]: Stopping User Manager for UID 0...
Dec  6 04:50:48 np0005548918 systemd[132403]: Activating special unit Exit the Session...
Dec  6 04:50:48 np0005548918 systemd[132403]: Stopped target Main User Target.
Dec  6 04:50:48 np0005548918 systemd[132403]: Stopped target Basic System.
Dec  6 04:50:48 np0005548918 systemd[132403]: Stopped target Paths.
Dec  6 04:50:48 np0005548918 systemd[132403]: Stopped target Sockets.
Dec  6 04:50:48 np0005548918 systemd[132403]: Stopped target Timers.
Dec  6 04:50:48 np0005548918 systemd[132403]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  6 04:50:48 np0005548918 systemd[132403]: Closed D-Bus User Message Bus Socket.
Dec  6 04:50:48 np0005548918 systemd[132403]: Stopped Create User's Volatile Files and Directories.
Dec  6 04:50:48 np0005548918 systemd[132403]: Removed slice User Application Slice.
Dec  6 04:50:48 np0005548918 systemd[132403]: Reached target Shutdown.
Dec  6 04:50:48 np0005548918 systemd[132403]: Finished Exit the Session.
Dec  6 04:50:48 np0005548918 systemd[132403]: Reached target Exit the Session.
Dec  6 04:50:48 np0005548918 systemd[1]: user@0.service: Deactivated successfully.
Dec  6 04:50:48 np0005548918 systemd[1]: Stopped User Manager for UID 0.
Dec  6 04:50:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:48 np0005548918 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec  6 04:50:48 np0005548918 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec  6 04:50:48 np0005548918 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec  6 04:50:48 np0005548918 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec  6 04:50:48 np0005548918 systemd[1]: Removed slice User Slice of UID 0.
Dec  6 04:50:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:48.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:49 np0005548918 python3.9[133155]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:50:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:49.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:49 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:50 np0005548918 python3.9[133317]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:50.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:50 np0005548918 python3.9[133469]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:51.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:51 np0005548918 python3.9[133622]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:51 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:52 np0005548918 python3.9[133775]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:52 np0005548918 python3.9[133927]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:52.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:52 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:53.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:53 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:54 np0005548918 python3.9[134079]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:50:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:54.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:55.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:55 np0005548918 python3.9[134232]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  6 04:50:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:55 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:56.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:57.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:57 np0005548918 python3.9[134384]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:57 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:50:58 np0005548918 python3.9[134507]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014656.5894802-221-247523363691263/.source follow=False _original_basename=haproxy.j2 checksum=cc5e97ea900947bff0c19d73b88d99840e041f49 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:58 np0005548918 python3.9[134657]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:50:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:50:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:58.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:50:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:50:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:50:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:59.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:50:59 np0005548918 python3.9[134779]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014658.3030226-265-217489761380860/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:50:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:50:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:50:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:50:59 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:50:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:50:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:00 np0005548918 python3.9[134933]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:51:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:00.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:01.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:01 np0005548918 python3.9[135018]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:51:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:01 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 04:51:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 5570 writes, 24K keys, 5570 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5570 writes, 875 syncs, 6.37 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5570 writes, 24K keys, 5570 commit groups, 1.0 writes per commit group, ingest: 19.09 MB, 0.03 MB/s#012Interval WAL: 5570 writes, 875 syncs, 6.37 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f8c486a9b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f8c486a9b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 
seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Dec  6 04:51:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:02.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:02 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:03.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:03 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:04.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:05.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:05 np0005548918 python3.9[135258]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:51:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:05 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c390 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:05 np0005548918 python3.9[135413]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:05 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:51:05 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:51:05 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:51:05 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:51:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:06 np0005548918 python3.9[135534]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014665.5350971-377-6012689792037/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.002000053s ======
Dec  6 04:51:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:06.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Dec  6 04:51:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:07 np0005548918 python3.9[135684]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:07.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:07 np0005548918 python3.9[135831]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014666.6075864-377-117255913202007/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:07 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:07 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c3b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:08.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:09 np0005548918 ovn_controller[132371]: 2025-12-06T09:51:09Z|00031|memory|INFO|16128 kB peak resident set size after 30.5 seconds
Dec  6 04:51:09 np0005548918 ovn_controller[132371]: 2025-12-06T09:51:09Z|00032|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Dec  6 04:51:09 np0005548918 podman[135956]: 2025-12-06 09:51:09.199904612 +0000 UTC m=+0.085366364 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 04:51:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:09.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:09 np0005548918 python3.9[135998]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:09 np0005548918 python3.9[136132]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014668.8877947-508-54989251661056/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:09 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:10 np0005548918 python3.9[136283]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c3d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:10.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:10 np0005548918 python3.9[136429]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014669.9881895-508-78095769954306/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:11.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:11 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:51:11 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:51:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:11 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:12 np0005548918 python3.9[136581]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:51:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:12 np0005548918 python3.9[136735]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:12.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:12 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:13.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:13 np0005548918 python3.9[136888]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:13 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c3f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:14 np0005548918 python3.9[136967]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c3f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:14 np0005548918 python3.9[137119]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c3f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:14.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:15 np0005548918 python3.9[137197]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:15.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:15 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:16 np0005548918 python3.9[137351]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c3f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:16.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:17 np0005548918 python3.9[137503]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:17.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:17 np0005548918 python3.9[137582]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:17 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:17 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:18 np0005548918 python3.9[137735]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:18.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:18 np0005548918 python3.9[137813]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:19.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:19 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:19 np0005548918 python3.9[137966]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:51:19 np0005548918 systemd[1]: Reloading.
Dec  6 04:51:20 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:51:20 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:51:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:20.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:21 np0005548918 python3.9[138157]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:21.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:21 np0005548918 python3.9[138236]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:21 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:22 np0005548918 python3.9[138389]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:22 np0005548918 python3.9[138467]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:22.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:23 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:23.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:23 np0005548918 python3.9[138620]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:51:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:23 np0005548918 systemd[1]: Reloading.
Dec  6 04:51:23 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:51:23 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:51:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:23 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:24 np0005548918 systemd[1]: Starting Create netns directory...
Dec  6 04:51:24 np0005548918 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  6 04:51:24 np0005548918 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  6 04:51:24 np0005548918 systemd[1]: Finished Create netns directory.
Dec  6 04:51:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c450 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:24 np0005548918 python3.9[138815]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:24.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:25.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:25 np0005548918 python3.9[138968]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:25 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:26 np0005548918 python3.9[139092]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014685.1535194-962-148430959971986/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:26.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:27 np0005548918 python3.9[139244]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:51:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:27.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:27 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:27 np0005548918 python3.9[139422]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:51:28 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:28 np0005548918 python3.9[139546]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014687.484563-1036-235469061451924/.source.json _original_basename=.t26dml2j follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c490 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:28.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:29 np0005548918 python3.9[139698]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:29.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:29 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060009cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:30.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:31.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:31 np0005548918 python3.9[140130]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec  6 04:51:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:31 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002170 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:32 np0005548918 python3.9[140283]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  6 04:51:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:32.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:33.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:33 np0005548918 python3.9[140436]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  6 04:51:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:33 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002170 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:34.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:35.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:35 np0005548918 python3[140615]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  6 04:51:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:35 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c4d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:36.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:37.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:37 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c4f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:38.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:39.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:39 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:40.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:41.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:41 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c510 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:42 np0005548918 podman[140697]: 2025-12-06 09:51:42.378589693 +0000 UTC m=+2.252136507 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec  6 04:51:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:42.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:43.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:43 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:43 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:44.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:45.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:45 np0005548918 podman[140628]: 2025-12-06 09:51:45.693077901 +0000 UTC m=+10.142746713 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
Dec  6 04:51:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:45 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:45 np0005548918 podman[140789]: 2025-12-06 09:51:45.819561204 +0000 UTC m=+0.027528558 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
Dec  6 04:51:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:46 np0005548918 podman[140789]: 2025-12-06 09:51:46.106034666 +0000 UTC m=+0.314002040 container create ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 04:51:46 np0005548918 python3[140615]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} 
--log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
Dec  6 04:51:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004b50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:46.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:47.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:47 np0005548918 python3.9[141006]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:51:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:47 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:48 np0005548918 python3.9[141161]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:48.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:49 np0005548918 python3.9[141237]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:51:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:51:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:49.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:51:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:49 np0005548918 python3.9[141389]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014709.1354082-1300-219456434515147/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:51:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:49 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004b50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:50 np0005548918 python3.9[141466]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:51:50 np0005548918 systemd[1]: Reloading.
Dec  6 04:51:50 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:51:50 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:51:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c570 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:50.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:51 np0005548918 python3.9[141577]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:51:51 np0005548918 systemd[1]: Reloading.
Dec  6 04:51:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:51.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:51 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:51:51 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:51:51 np0005548918 systemd[1]: Starting ovn_metadata_agent container...
Dec  6 04:51:51 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:51:51 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f5de78fc063b0965daae261c7c411e9502afdc52e6413383b8aefb5ce18aaa1/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec  6 04:51:51 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f5de78fc063b0965daae261c7c411e9502afdc52e6413383b8aefb5ce18aaa1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  6 04:51:51 np0005548918 systemd[1]: Started /usr/bin/podman healthcheck run ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6.
Dec  6 04:51:51 np0005548918 podman[141620]: 2025-12-06 09:51:51.707858002 +0000 UTC m=+0.118456280 container init ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  6 04:51:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: + sudo -E kolla_set_configs
Dec  6 04:51:51 np0005548918 podman[141620]: 2025-12-06 09:51:51.741632086 +0000 UTC m=+0.152230314 container start ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec  6 04:51:51 np0005548918 edpm-start-podman-container[141620]: ovn_metadata_agent
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Validating config file
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Copying service configuration files
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Writing out command to execute
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Setting permission for /var/lib/neutron
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: ++ cat /run_command
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: + CMD=neutron-ovn-metadata-agent
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: + ARGS=
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: + sudo kolla_copy_cacerts
Dec  6 04:51:51 np0005548918 edpm-start-podman-container[141619]: Creating additional drop-in dependency for "ovn_metadata_agent" (ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6)
Dec  6 04:51:51 np0005548918 podman[141642]: 2025-12-06 09:51:51.813990441 +0000 UTC m=+0.062066152 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: + [[ ! -n '' ]]
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: + . kolla_extend_start
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: Running command: 'neutron-ovn-metadata-agent'
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: + umask 0022
Dec  6 04:51:51 np0005548918 ovn_metadata_agent[141635]: + exec neutron-ovn-metadata-agent
Dec  6 04:51:51 np0005548918 systemd[1]: Reloading.
Dec  6 04:51:51 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:51:51 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:51:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:51 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:52 np0005548918 systemd[1]: Started ovn_metadata_agent container.
Dec  6 04:51:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004b70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c590 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:52 np0005548918 systemd-logind[800]: Session 51 logged out. Waiting for processes to exit.
Dec  6 04:51:52 np0005548918 systemd[1]: session-51.scope: Deactivated successfully.
Dec  6 04:51:52 np0005548918 systemd[1]: session-51.scope: Consumed 54.910s CPU time.
Dec  6 04:51:52 np0005548918 systemd-logind[800]: Removed session 51.
Dec  6 04:51:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:52.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:53.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.602 141640 INFO neutron.common.config [-] Logging enabled!#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.603 141640 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.603 141640 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.603 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.604 141640 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.604 141640 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.604 141640 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.604 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.604 141640 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.604 141640 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.605 141640 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.605 141640 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.605 141640 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.605 141640 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.605 141640 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.605 141640 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.605 141640 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.606 141640 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.606 141640 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.606 141640 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.606 141640 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.606 141640 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.606 141640 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.606 141640 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.607 141640 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.607 141640 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.607 141640 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.607 141640 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.607 141640 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.607 141640 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.607 141640 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.607 141640 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.608 141640 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.608 141640 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.608 141640 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.608 141640 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.608 141640 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.608 141640 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.608 141640 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.609 141640 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.609 141640 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.609 141640 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.609 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.609 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.609 141640 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.609 141640 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.609 141640 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.609 141640 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.609 141640 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.610 141640 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.610 141640 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.610 141640 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.610 141640 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.610 141640 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.610 141640 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.610 141640 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.611 141640 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.611 141640 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.611 141640 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.611 141640 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.611 141640 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.611 141640 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.611 141640 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.611 141640 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.612 141640 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.612 141640 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.612 141640 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.612 141640 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.612 141640 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.612 141640 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.612 141640 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.613 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.613 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.613 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.613 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.613 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.613 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.613 141640 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.613 141640 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.614 141640 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.614 141640 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.614 141640 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.614 141640 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.614 141640 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.614 141640 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.614 141640 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.615 141640 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.615 141640 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.615 141640 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.615 141640 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.615 141640 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.615 141640 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.615 141640 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.615 141640 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.615 141640 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.616 141640 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.616 141640 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.616 141640 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.616 141640 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.616 141640 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.616 141640 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.616 141640 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.616 141640 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.617 141640 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.617 141640 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.617 141640 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.617 141640 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.617 141640 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.617 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.617 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.618 141640 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.618 141640 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.618 141640 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.618 141640 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.618 141640 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.618 141640 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.618 141640 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.618 141640 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.619 141640 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.619 141640 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.619 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.619 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.619 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.619 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.620 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.620 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.620 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.620 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.620 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.620 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.620 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.620 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.620 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.621 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.621 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.621 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.621 141640 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.621 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.621 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.621 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.622 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.622 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.622 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.622 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.622 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.622 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.622 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.623 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.623 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.623 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.623 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.623 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.623 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.623 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.623 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.624 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.624 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.624 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.624 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.624 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.624 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.624 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.625 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.625 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.625 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.625 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.625 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.625 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.625 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.626 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.626 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.626 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.626 141640 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.626 141640 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.626 141640 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.626 141640 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.627 141640 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.627 141640 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.627 141640 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.627 141640 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.627 141640 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.627 141640 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.627 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.628 141640 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.628 141640 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.628 141640 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.628 141640 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.628 141640 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.628 141640 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.628 141640 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.629 141640 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.629 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.629 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.629 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.629 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.629 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.629 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.630 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.630 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.630 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.630 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.630 141640 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.630 141640 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.630 141640 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.630 141640 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.631 141640 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.631 141640 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.631 141640 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.631 141640 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.631 141640 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.631 141640 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.631 141640 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.631 141640 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.632 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.632 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.632 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.632 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.632 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.632 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.632 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.633 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.633 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.633 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.633 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.633 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.633 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.633 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.634 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.634 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.634 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.634 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.634 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.634 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.634 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.634 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.634 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.634 141640 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.635 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.635 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.635 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.635 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.635 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.635 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.635 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.635 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.636 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.636 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.636 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.636 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.636 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.636 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.636 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.636 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.637 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.637 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.637 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.637 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.637 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.637 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.637 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.637 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.638 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.638 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.638 141640 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.638 141640 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.638 141640 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.638 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.638 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.638 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.639 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.639 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.639 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.639 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.639 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.639 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.639 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.639 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.640 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.640 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.640 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.640 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.640 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.640 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.640 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.641 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.641 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.641 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.641 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.641 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.641 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.641 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.642 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.642 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.642 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.642 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.642 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.642 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.642 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.642 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.643 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.643 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.643 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.643 141640 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.643 141640 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.654 141640 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.654 141640 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.654 141640 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.654 141640 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.655 141640 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.668 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 1b31b208-e0d4-490d-9f30-552f5575d012 (UUID: 1b31b208-e0d4-490d-9f30-552f5575d012) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.692 141640 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.692 141640 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.692 141640 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.692 141640 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.695 141640 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.701 141640 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.707 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '1b31b208-e0d4-490d-9f30-552f5575d012'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], external_ids={}, name=1b31b208-e0d4-490d-9f30-552f5575d012, nb_cfg_timestamp=1765014647742, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.708 141640 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fef907c5f40>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.709 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 04:51:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.709 141640 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.709 141640 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.709 141640 INFO oslo_service.service [-] Starting 1 workers#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.713 141640 DEBUG oslo_service.service [-] Started child 141748 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.716 141748 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-886236'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.717 141640 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp427z_cs5/privsep.sock']#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.735 141748 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.735 141748 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.735 141748 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.738 141748 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.744 141748 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  6 04:51:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:53.749 141748 INFO eventlet.wsgi.server [-] (141748) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Dec  6 04:51:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:53 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:54 np0005548918 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec  6 04:51:54 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:54.377 141640 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  6 04:51:54 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:54.379 141640 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp427z_cs5/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  6 04:51:54 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:54.244 141754 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  6 04:51:54 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:54.248 141754 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  6 04:51:54 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:54.250 141754 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Dec  6 04:51:54 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:54.250 141754 INFO oslo.privsep.daemon [-] privsep daemon running as pid 141754#033[00m
Dec  6 04:51:54 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:54.383 141754 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc317c8-1121-4623-bf9b-1852b27eacf8]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 04:51:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:54 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:54.891 141754 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:51:54 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:54.891 141754 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:51:54 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:54.892 141754 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 04:51:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:51:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:54.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:51:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:55.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.445 141754 DEBUG oslo.privsep.daemon [-] privsep: reply[f628b1ec-c329-46fc-b9fa-ef0d63c75b6f]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.447 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, column=external_ids, values=({'neutron:ovn-metadata-id': 'ed5462b0-2842-50c1-b983-c0231ea33af3'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.472 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.496 141640 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.496 141640 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.496 141640 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.496 141640 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.496 141640 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.496 141640 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.496 141640 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.496 141640 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.496 141640 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.497 141640 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.497 141640 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.497 141640 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.497 141640 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.497 141640 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.497 141640 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.497 141640 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.497 141640 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.497 141640 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.498 141640 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.498 141640 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.498 141640 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.498 141640 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.498 141640 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.498 141640 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.498 141640 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.498 141640 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.499 141640 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.499 141640 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.499 141640 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.499 141640 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.499 141640 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.499 141640 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.499 141640 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.499 141640 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.500 141640 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.500 141640 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.500 141640 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.500 141640 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.500 141640 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.500 141640 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.500 141640 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.500 141640 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.500 141640 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.501 141640 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.501 141640 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.501 141640 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.501 141640 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.501 141640 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.501 141640 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.501 141640 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.501 141640 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.501 141640 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.501 141640 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.502 141640 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.502 141640 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.502 141640 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.502 141640 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.502 141640 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.502 141640 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.502 141640 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.502 141640 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.502 141640 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.503 141640 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.503 141640 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.503 141640 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.503 141640 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.503 141640 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.503 141640 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.503 141640 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.503 141640 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.503 141640 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.503 141640 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.504 141640 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.504 141640 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.504 141640 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.504 141640 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.504 141640 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.504 141640 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.504 141640 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.504 141640 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.504 141640 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.505 141640 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.505 141640 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.505 141640 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.505 141640 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.505 141640 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.505 141640 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.505 141640 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.505 141640 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.505 141640 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.505 141640 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.506 141640 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.506 141640 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.506 141640 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.506 141640 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.506 141640 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.506 141640 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.506 141640 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.506 141640 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.506 141640 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.506 141640 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.507 141640 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.507 141640 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.507 141640 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.507 141640 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.507 141640 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.507 141640 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.507 141640 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.507 141640 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.507 141640 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.508 141640 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.508 141640 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.508 141640 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.508 141640 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.508 141640 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.508 141640 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.508 141640 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.508 141640 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.508 141640 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.509 141640 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.509 141640 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.509 141640 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.509 141640 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.509 141640 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.509 141640 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.509 141640 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.509 141640 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.509 141640 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.510 141640 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.510 141640 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.510 141640 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.510 141640 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.510 141640 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.510 141640 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.510 141640 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.510 141640 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.510 141640 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.511 141640 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.511 141640 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.511 141640 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.511 141640 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.511 141640 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.511 141640 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.511 141640 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.511 141640 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.511 141640 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.512 141640 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.512 141640 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.512 141640 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.512 141640 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.512 141640 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.512 141640 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.512 141640 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.512 141640 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.512 141640 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.512 141640 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.513 141640 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.513 141640 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.513 141640 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.513 141640 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.513 141640 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.513 141640 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.513 141640 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.513 141640 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.513 141640 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.513 141640 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.513 141640 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.514 141640 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.514 141640 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.514 141640 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.514 141640 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.514 141640 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.514 141640 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.514 141640 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.514 141640 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.514 141640 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.515 141640 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.515 141640 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.515 141640 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.515 141640 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.515 141640 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.515 141640 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.515 141640 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.515 141640 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.515 141640 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.515 141640 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.516 141640 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.516 141640 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.516 141640 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.516 141640 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.516 141640 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.516 141640 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.516 141640 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.516 141640 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.516 141640 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.516 141640 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.517 141640 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.517 141640 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.517 141640 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.517 141640 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.517 141640 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.517 141640 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.517 141640 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.517 141640 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.517 141640 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.518 141640 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.518 141640 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.518 141640 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.518 141640 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.518 141640 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.518 141640 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.518 141640 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.518 141640 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.518 141640 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.518 141640 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.519 141640 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.519 141640 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.519 141640 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.519 141640 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.519 141640 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.519 141640 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.519 141640 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.519 141640 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.519 141640 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.519 141640 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.520 141640 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.520 141640 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.520 141640 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.520 141640 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.520 141640 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.520 141640 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.520 141640 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.520 141640 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.520 141640 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.520 141640 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.520 141640 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.521 141640 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.521 141640 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.521 141640 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.521 141640 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.521 141640 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.521 141640 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.521 141640 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.521 141640 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.521 141640 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.522 141640 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.522 141640 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.522 141640 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.522 141640 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.522 141640 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.522 141640 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.522 141640 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.522 141640 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.522 141640 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.523 141640 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.523 141640 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.523 141640 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.523 141640 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.523 141640 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.523 141640 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.523 141640 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.523 141640 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.523 141640 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.523 141640 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.524 141640 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.524 141640 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.524 141640 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.524 141640 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.524 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.524 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.524 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.524 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.524 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.524 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.525 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.525 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.525 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.525 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.525 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.525 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.525 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.525 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.525 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.525 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.526 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.526 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.526 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.526 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.526 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.526 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.526 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.526 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.526 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.527 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.527 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.527 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.527 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.527 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.527 141640 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.527 141640 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.527 141640 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.527 141640 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.527 141640 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 04:51:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:51:55.528 141640 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  6 04:51:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:55 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c5b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:56.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:51:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:57.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:51:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:57 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004bb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:58 np0005548918 systemd-logind[800]: New session 52 of user zuul.
Dec  6 04:51:58 np0005548918 systemd[1]: Started Session 52 of User zuul.
Dec  6 04:51:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c5d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:51:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:58.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:59 np0005548918 python3.9[141916]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:51:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:51:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:51:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:59.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:51:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:51:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:51:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:51:59 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:51:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:51:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Dec  6 04:52:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Dec  6 04:52:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:00 np0005548918 python3.9[142074]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c5f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:00.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:01.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:01 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:02 np0005548918 python3.9[142240]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:52:02 np0005548918 systemd[1]: Reloading.
Dec  6 04:52:02 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:52:02 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:52:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:02.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:03.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:03 np0005548918 python3.9[142426]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:52:03 np0005548918 network[142444]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:52:03 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:03 np0005548918 network[142445]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:52:03 np0005548918 network[142446]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:52:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:03 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c610 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:04.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:05.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:05 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:52:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:06.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:52:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:07.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095207 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:52:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:07 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:08 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:08.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:09 np0005548918 python3.9[142737]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:52:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:09.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:09 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:09 np0005548918 python3.9[142891]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:52:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:10 np0005548918 python3.9[143045]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:52:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:10.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:11.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:11 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:12 np0005548918 python3.9[143270]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:52:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:12 np0005548918 python3.9[143434]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:52:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:12.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:13.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:13 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:52:13 np0005548918 python3.9[143588]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:52:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:13 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:13 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:14 np0005548918 podman[143714]: 2025-12-06 09:52:14.464772088 +0000 UTC m=+0.133197283 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Dec  6 04:52:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:14 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:52:14 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:52:14 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:52:14 np0005548918 python3.9[143761]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:52:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00c6f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:14.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:15.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:15 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:16 np0005548918 python3.9[143923]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:16.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:17.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:17 np0005548918 python3.9[144076]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:17 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:18 np0005548918 python3.9[144229]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:52:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:18 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:18 np0005548918 python3.9[144381]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:18.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:52:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:19.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:52:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:19 np0005548918 python3.9[144534]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:19 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:20 np0005548918 python3.9[144687]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:21.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:21 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:52:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:21 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:52:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:21.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:21 np0005548918 python3.9[144839]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:21 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:22 np0005548918 podman[144941]: 2025-12-06 09:52:22.166350637 +0000 UTC m=+0.047869731 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  6 04:52:22 np0005548918 python3.9[145009]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002600 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:22 np0005548918 python3.9[145161]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:23.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:23.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:23 np0005548918 python3.9[145314]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:23 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:23 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:24 np0005548918 python3.9[145467]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:52:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:24 np0005548918 python3.9[145619]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:25.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:25 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:52:25 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:52:25 np0005548918 python3.9[145797]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:25.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:25 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:25 np0005548918 python3.9[145950]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:52:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:27.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:27.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:27 np0005548918 python3.9[146103]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:27 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:28 np0005548918 python3.9[146281]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  6 04:52:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:28 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:29.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:29 np0005548918 python3.9[146433]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:52:29 np0005548918 systemd[1]: Reloading.
Dec  6 04:52:29 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:52:29 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:52:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:29.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095229 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:52:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:29 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:30 np0005548918 python3.9[146621]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:31 np0005548918 python3.9[146774]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:31.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:31.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:31 np0005548918 python3.9[146928]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:31 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:32 np0005548918 python3.9[147082]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:32 np0005548918 python3.9[147235]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:52:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:33.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:52:33 np0005548918 python3.9[147389]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:33.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:33 np0005548918 python3.9[147542]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:52:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:33 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:35.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:35 np0005548918 python3.9[147697]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec  6 04:52:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:35.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:35 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:36 np0005548918 python3.9[147851]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  6 04:52:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:37.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:37.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:37 np0005548918 python3.9[148010]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  6 04:52:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:37 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:39.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:39.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:39 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:40 np0005548918 python3.9[148173]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:52:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:41.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:41 np0005548918 python3.9[148257]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:52:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:41.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:41 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:43.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:43.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:43 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:43 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:45.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:45 np0005548918 podman[148273]: 2025-12-06 09:52:45.235063183 +0000 UTC m=+0.099822161 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec  6 04:52:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:45.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:45 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:47.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:47.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:47 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003480 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:49.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:49.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:49 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:52:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:51.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:52:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:51.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:51 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003480 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:53.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:53 np0005548918 podman[148334]: 2025-12-06 09:52:53.217620827 +0000 UTC m=+0.093558604 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec  6 04:52:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:53.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:52:53.656 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:52:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:52:53.658 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:52:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:52:53.658 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 04:52:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:53 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003480 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:55.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:55.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:55 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003480 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:52:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:57.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:52:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:57.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:57 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:52:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:52:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:59.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:52:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:52:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:59.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:52:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:52:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:52:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:52:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:52:59 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003480 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:01.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:01.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:01 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003480 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:03.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:03.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:03 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:03 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003480 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:05.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:05.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:05 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:07.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:07.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:07 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:08 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:09.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:09.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:09 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300026d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:11.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:11.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:11 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80280047e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:13.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:13.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:13 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:13 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:15.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:15.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:15 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:16 np0005548918 podman[148584]: 2025-12-06 09:53:16.2027496 +0000 UTC m=+0.090208777 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec  6 04:53:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:17.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:17.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:17 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:18 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:19.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:19.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:21.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:21.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:21 np0005548918 kernel: SELinux:  Converting 2774 SID table entries...
Dec  6 04:53:21 np0005548918 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:53:21 np0005548918 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:53:21 np0005548918 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:53:21 np0005548918 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:53:21 np0005548918 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:53:21 np0005548918 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:53:21 np0005548918 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:53:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c0036a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:23.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:23.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:23 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:24 np0005548918 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec  6 04:53:24 np0005548918 podman[148626]: 2025-12-06 09:53:24.195295218 +0000 UTC m=+0.079126846 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  6 04:53:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c0036c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:25.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:25.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:26 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:53:26 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:53:26 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:53:26 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:53:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c0036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:27.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:27.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095327 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:53:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034004c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:28 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:29.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:29.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:31.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:53:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:53:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:31.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:31 np0005548918 kernel: SELinux:  Converting 2774 SID table entries...
Dec  6 04:53:31 np0005548918 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:53:31 np0005548918 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:53:31 np0005548918 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:53:31 np0005548918 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:53:31 np0005548918 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:53:31 np0005548918 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:53:31 np0005548918 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:53:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:33.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:33.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:35.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:35.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:53:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:37.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:37.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:39.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:39 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:53:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:39 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:53:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:39.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:41.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:41.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:53:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:43.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:43.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:43 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c00cf60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:45.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:45.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:47 np0005548918 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec  6 04:53:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:47.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:47 np0005548918 podman[150828]: 2025-12-06 09:53:47.253350913 +0000 UTC m=+0.134267192 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  6 04:53:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:47.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095347 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:53:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:49.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:49.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058003840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  6 04:53:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:51.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  6 04:53:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:51.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:53:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:53.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:53:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:53.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:53:53.658 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:53:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:53:53.658 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:53:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:53:53.658 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 04:53:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058003840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058003840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:55.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:55 np0005548918 podman[156106]: 2025-12-06 09:53:55.16512948 +0000 UTC m=+0.058901569 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  6 04:53:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:55.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058003840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:57.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:57.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:53:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:53:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:53:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:59.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:53:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:53:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:59.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:53:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:53:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:53:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:53:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058003840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:00 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:01.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:01.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058003840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:02 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:03.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:03.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:03 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:04 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8058003840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:05.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:05.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:06 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:07.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:07.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580057a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:08 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:08 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8060008ed0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:09.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:09.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580057a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:10 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:11.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:11.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:12 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:13.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:13.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:13 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580057a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:14 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:15.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:15.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80580057a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:16 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:17.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:17.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:18 np0005548918 podman[165742]: 2025-12-06 09:54:18.220119942 +0000 UTC m=+0.105497872 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec  6 04:54:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:18 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:18 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:19.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:19.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:20 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:21.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:21.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:22 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:23.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:23.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:23 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095424 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:54:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:24 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:25.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:25.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:25 np0005548918 kernel: SELinux:  Converting 2775 SID table entries...
Dec  6 04:54:26 np0005548918 kernel: SELinux:  policy capability network_peer_controls=1
Dec  6 04:54:26 np0005548918 kernel: SELinux:  policy capability open_perms=1
Dec  6 04:54:26 np0005548918 kernel: SELinux:  policy capability extended_socket_class=1
Dec  6 04:54:26 np0005548918 kernel: SELinux:  policy capability always_check_network=0
Dec  6 04:54:26 np0005548918 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  6 04:54:26 np0005548918 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  6 04:54:26 np0005548918 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  6 04:54:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:26 np0005548918 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec  6 04:54:26 np0005548918 podman[165782]: 2025-12-06 09:54:26.171948181 +0000 UTC m=+0.054301017 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec  6 04:54:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:26 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:27 np0005548918 dbus-broker-launch[747]: Noticed file-system modification, trigger reload.
Dec  6 04:54:27 np0005548918 dbus-broker-launch[747]: Noticed file-system modification, trigger reload.
Dec  6 04:54:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:27.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:27.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:28 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:28 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:29.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:29.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:30 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:31.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:31 np0005548918 podman[166184]: 2025-12-06 09:54:31.427056664 +0000 UTC m=+0.052316356 container exec 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Dec  6 04:54:31 np0005548918 podman[166184]: 2025-12-06 09:54:31.518094912 +0000 UTC m=+0.143354634 container exec_died 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Dec  6 04:54:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:54:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:31.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:54:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:32 np0005548918 podman[166316]: 2025-12-06 09:54:32.009597494 +0000 UTC m=+0.058992342 container exec 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:54:32 np0005548918 podman[166316]: 2025-12-06 09:54:32.022427104 +0000 UTC m=+0.071821952 container exec_died 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 04:54:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:32 np0005548918 podman[166407]: 2025-12-06 09:54:32.449683585 +0000 UTC m=+0.073454934 container exec 5bbd0707ce20ed32133f52fc5be40478c3e1dee6c3214441b79974078459fdb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Dec  6 04:54:32 np0005548918 podman[166407]: 2025-12-06 09:54:32.462753051 +0000 UTC m=+0.086524350 container exec_died 5bbd0707ce20ed32133f52fc5be40478c3e1dee6c3214441b79974078459fdb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS)
Dec  6 04:54:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:32 np0005548918 podman[166472]: 2025-12-06 09:54:32.736744979 +0000 UTC m=+0.074607635 container exec 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna)
Dec  6 04:54:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:32 np0005548918 podman[166472]: 2025-12-06 09:54:32.751350925 +0000 UTC m=+0.089213611 container exec_died 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna)
Dec  6 04:54:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:32 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:33 np0005548918 podman[166540]: 2025-12-06 09:54:33.117717117 +0000 UTC m=+0.153924263 container exec cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg, version=2.2.4, com.redhat.component=keepalived-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, name=keepalived, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec  6 04:54:33 np0005548918 podman[166540]: 2025-12-06 09:54:33.130446573 +0000 UTC m=+0.166653719 container exec_died cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, description=keepalived for Ceph, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, vcs-type=git)
Dec  6 04:54:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:33.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:33 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  6 04:54:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:33.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:33 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:54:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:34 np0005548918 systemd[1]: Stopping OpenSSH server daemon...
Dec  6 04:54:34 np0005548918 systemd[1]: sshd.service: Deactivated successfully.
Dec  6 04:54:34 np0005548918 systemd[1]: Stopped OpenSSH server daemon.
Dec  6 04:54:34 np0005548918 systemd[1]: sshd.service: Consumed 2.601s CPU time, read 32.0K from disk, written 0B to disk.
Dec  6 04:54:34 np0005548918 systemd[1]: Stopped target sshd-keygen.target.
Dec  6 04:54:34 np0005548918 systemd[1]: Stopping sshd-keygen.target...
Dec  6 04:54:34 np0005548918 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  6 04:54:34 np0005548918 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  6 04:54:34 np0005548918 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  6 04:54:34 np0005548918 systemd[1]: Reached target sshd-keygen.target.
Dec  6 04:54:34 np0005548918 systemd[1]: Starting OpenSSH server daemon...
Dec  6 04:54:34 np0005548918 systemd[1]: Started OpenSSH server daemon.
Dec  6 04:54:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:34 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:34 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:34 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  6 04:54:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:34 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:35.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:35.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  6 04:54:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:54:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:54:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:36 np0005548918 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:54:36 np0005548918 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:54:36 np0005548918 systemd[1]: Reloading.
Dec  6 04:54:36 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:36 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:36 np0005548918 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:54:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:54:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:54:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:36 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:37 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:54:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:37.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:37.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:38 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:54:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:39.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:54:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:39.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:54:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:40 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:41.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:41.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:42 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:42 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:54:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:42 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:43.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:43.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:43 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:44 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:45.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:45.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:46 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803c003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:46 np0005548918 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:54:46 np0005548918 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:54:46 np0005548918 systemd[1]: man-db-cache-update.service: Consumed 12.572s CPU time.
Dec  6 04:54:46 np0005548918 systemd[1]: run-rd197bb8bb226400aa6dc53fdc331f9a2.service: Deactivated successfully.
Dec  6 04:54:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095446 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:54:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:47.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:47 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_38] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8030004b20 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:47 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80300045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:47.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:48 np0005548918 podman[176070]: 2025-12-06 09:54:48.503158037 +0000 UTC m=+0.138947097 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 04:54:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:48 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c0027a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:49.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:49.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:50 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:51.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:51.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c0027a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:52 np0005548918 python3.9[176230]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:54:52 np0005548918 systemd[1]: Reloading.
Dec  6 04:54:52 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:52 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:52 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:53.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:53.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:54:53.660 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:54:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:54:53.661 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:54:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:54:53.661 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 04:54:53 np0005548918 python3.9[176421]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:54:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:53 np0005548918 systemd[1]: Reloading.
Dec  6 04:54:53 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:53 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:54 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002940 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:54 np0005548918 python3.9[176611]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:54:54 np0005548918 systemd[1]: Reloading.
Dec  6 04:54:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:55 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:55 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:55 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:55.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:55.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:56 np0005548918 python3.9[176802]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:54:56 np0005548918 systemd[1]: Reloading.
Dec  6 04:54:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:56 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:56 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:56 np0005548918 podman[176841]: 2025-12-06 09:54:56.453670511 +0000 UTC m=+0.062668559 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  6 04:54:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:56 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:57 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f804c002940 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:57.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:57.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:57 np0005548918 python3.9[177012]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:54:57 np0005548918 systemd[1]: Reloading.
Dec  6 04:54:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:57 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:57 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8028001b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:58 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:54:58 np0005548918 python3.9[177203]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:54:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:58 np0005548918 systemd[1]: Reloading.
Dec  6 04:54:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:54:58 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:54:58 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:54:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:59 np0005548918 kernel: ganesha.nfsd[148581]: segfault at 50 ip 00007f810bad232e sp 00007f80c37fd210 error 4 in libntirpc.so.5.8[7f810bab7000+2c000] likely on CPU 4 (core 0, socket 4)
Dec  6 04:54:59 np0005548918 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 04:54:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[95101]: 06/12/2025 09:54:59 : epoch 6933fb34 : compute-2 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f806000b460 fd 39 proxy ignored for local
Dec  6 04:54:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:54:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:59.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:54:59 np0005548918 systemd[1]: Started Process Core Dump (PID 177242/UID 0).
Dec  6 04:54:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:54:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:54:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:59.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:54:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:54:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:54:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:54:59 np0005548918 python3.9[177396]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:00 np0005548918 systemd[1]: Reloading.
Dec  6 04:55:00 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:55:00 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:55:00 np0005548918 systemd-coredump[177244]: Process 95105 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 85:#012#0  0x00007f810bad232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 04:55:00 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:55:00 np0005548918 systemd[1]: systemd-coredump@1-177242-0.service: Deactivated successfully.
Dec  6 04:55:00 np0005548918 systemd[1]: systemd-coredump@1-177242-0.service: Consumed 1.249s CPU time.
Dec  6 04:55:00 np0005548918 podman[177468]: 2025-12-06 09:55:00.557727195 +0000 UTC m=+0.028646228 container died 5bbd0707ce20ed32133f52fc5be40478c3e1dee6c3214441b79974078459fdb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 04:55:00 np0005548918 systemd[1]: var-lib-containers-storage-overlay-f3c285c64e673bac02ec67e13d83178d050b1449d57c31d970b1d4055eba249c-merged.mount: Deactivated successfully.
Dec  6 04:55:00 np0005548918 podman[177468]: 2025-12-06 09:55:00.600337692 +0000 UTC m=+0.071256705 container remove 5bbd0707ce20ed32133f52fc5be40478c3e1dee6c3214441b79974078459fdb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  6 04:55:00 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Main process exited, code=exited, status=139/n/a
Dec  6 04:55:00 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Failed with result 'exit-code'.
Dec  6 04:55:00 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 2.352s CPU time.
Dec  6 04:55:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:01.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:01 np0005548918 python3.9[177636]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:01.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:03.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:03 np0005548918 python3.9[177793]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:03 np0005548918 systemd[1]: Reloading.
Dec  6 04:55:03 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:55:03 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:55:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:03.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:03 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095505 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:55:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:05.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:05.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:06 np0005548918 python3.9[177986]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  6 04:55:06 np0005548918 systemd[1]: Reloading.
Dec  6 04:55:06 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:55:06 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:55:06 np0005548918 systemd[1]: Listening on libvirt proxy daemon socket.
Dec  6 04:55:06 np0005548918 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec  6 04:55:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:07.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:07 np0005548918 python3.9[178180]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:55:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:07.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:55:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:08 np0005548918 python3.9[178336]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:08 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:09.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:09 np0005548918 python3.9[178517]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:09.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:10 np0005548918 python3.9[178673]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:10 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Scheduled restart job, restart counter is at 2.
Dec  6 04:55:10 np0005548918 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:55:10 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 2.352s CPU time.
Dec  6 04:55:10 np0005548918 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:55:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:11 np0005548918 podman[178875]: 2025-12-06 09:55:11.108854333 +0000 UTC m=+0.070896276 container create 66ea029f16e25b79929c69fe4d2619fcdbba8859c307f483b2af383d02c317dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec  6 04:55:11 np0005548918 podman[178875]: 2025-12-06 09:55:11.060656299 +0000 UTC m=+0.022698252 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:55:11 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/772b008259e97b733137c23e52f6666d74d16c2e1b24061694396786477ebb89/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 04:55:11 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/772b008259e97b733137c23e52f6666d74d16c2e1b24061694396786477ebb89/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:55:11 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/772b008259e97b733137c23e52f6666d74d16c2e1b24061694396786477ebb89/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:55:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:11.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:11 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/772b008259e97b733137c23e52f6666d74d16c2e1b24061694396786477ebb89/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.sseuqb-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:55:11 np0005548918 podman[178875]: 2025-12-06 09:55:11.191179872 +0000 UTC m=+0.153221845 container init 66ea029f16e25b79929c69fe4d2619fcdbba8859c307f483b2af383d02c317dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Dec  6 04:55:11 np0005548918 podman[178875]: 2025-12-06 09:55:11.198158746 +0000 UTC m=+0.160200699 container start 66ea029f16e25b79929c69fe4d2619fcdbba8859c307f483b2af383d02c317dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  6 04:55:11 np0005548918 bash[178875]: 66ea029f16e25b79929c69fe4d2619fcdbba8859c307f483b2af383d02c317dd
Dec  6 04:55:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:11 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 04:55:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:11 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 04:55:11 np0005548918 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:55:11 np0005548918 python3.9[178847]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:11 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 04:55:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:11 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 04:55:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:11 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 04:55:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:11 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 04:55:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:11 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 04:55:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:11 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:55:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:11.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:12 np0005548918 python3.9[179088]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:13.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:13 np0005548918 python3.9[179243]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:13.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:13 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:14 np0005548918 python3.9[179399]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:15 np0005548918 python3.9[179555]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:15.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:15.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:16 np0005548918 python3.9[179711]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:16 np0005548918 python3.9[179867]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:17.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:17 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:55:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:17 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:55:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:17.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:18 np0005548918 python3.9[180023]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:18 np0005548918 podman[180151]: 2025-12-06 09:55:18.694868915 +0000 UTC m=+0.139802329 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 04:55:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:18 np0005548918 python3.9[180198]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:18 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:19.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:19.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:19 np0005548918 python3.9[180361]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  6 04:55:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:21.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:21.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:23.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:23 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:55:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:23.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:23 np0005548918 python3.9[180526]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:23 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:24 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6824000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:24 np0005548918 python3.9[180688]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:55:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:24 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6820001c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:24 np0005548918 python3.9[180840]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:55:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:25 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:25.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:25 np0005548918 python3.9[180993]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:55:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:25.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:26 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67f8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:26 np0005548918 python3.9[181146]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:55:26 np0005548918 podman[181270]: 2025-12-06 09:55:26.695456163 +0000 UTC m=+0.069661164 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec  6 04:55:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:26 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:26 np0005548918 python3.9[181315]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:55:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:27 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6820001c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095527 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:55:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:27.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:27.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:28 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:28 np0005548918 python3.9[181472]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:28 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67f80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:28 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:29 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:29 np0005548918 python3.9[181622]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014927.6618974-1624-209074601089006/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:29.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:55:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:29.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:55:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:29 np0005548918 python3.9[181775]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:30 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6820002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:30 np0005548918 python3.9[181901]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014929.297714-1624-16955763013499/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:30 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:31 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67f80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:31.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:31 np0005548918 python3.9[182054]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:31.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:32 np0005548918 python3.9[182180]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014930.8671315-1624-192499543050030/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:32 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:32 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6820002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:32 np0005548918 python3.9[182332]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:33 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:33.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:33 np0005548918 python3.9[182458]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014932.2642379-1624-237182739834410/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:33.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:34 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67f80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:34 np0005548918 python3.9[182611]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:34 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:34 np0005548918 python3.9[182736]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014933.7607853-1624-163184932715249/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:35 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6820002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:35.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:35.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:35 np0005548918 python3.9[182889]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:36 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:36 np0005548918 python3.9[183015]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014935.1411233-1624-247147138950786/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:36 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67f8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:37 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:37 np0005548918 python3.9[183167]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:37.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:37.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:37 np0005548918 python3.9[183291]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014936.5677288-1624-191819788228056/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:38 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6820002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:38 np0005548918 python3.9[183444]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:38 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:39 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67f8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:39.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:39 np0005548918 python3.9[183569]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014937.986495-1624-169089724379839/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:39.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:40 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:40 np0005548918 python3.9[183723]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec  6 04:55:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:40 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6820002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:41 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:41.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:41 np0005548918 python3.9[183877]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:41.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:42 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67f8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:42 np0005548918 python3.9[184030]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:42 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:43 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6820002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:43 np0005548918 python3.9[184249]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:43.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:43.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:43 np0005548918 python3.9[184416]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:43 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:44 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:44 np0005548918 python3.9[184569]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:44 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:45 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:45 np0005548918 python3.9[184721]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:45.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:45.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:45 np0005548918 python3.9[184874]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:46 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6820002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:46 np0005548918 python3.9[185027]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:46 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:55:46 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:55:46 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:55:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:46 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:47 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:47.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:47 np0005548918 python3.9[185180]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:47.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:55:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:55:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:55:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:48 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:48 np0005548918 python3.9[185333]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:48 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:48 np0005548918 python3.9[185510]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:49 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:49 np0005548918 podman[185562]: 2025-12-06 09:55:49.208391026 +0000 UTC m=+0.089706654 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:55:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:49.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:49 np0005548918 python3.9[185689]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:49.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:50 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:50 np0005548918 python3.9[185842]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:50 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6820002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:50 np0005548918 python3.9[185994]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:51 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6820002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:51.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:51.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:52 np0005548918 python3.9[186148]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:52 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:52 np0005548918 python3.9[186296]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014951.5045094-2288-175666393247991/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:52 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:52 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:55:52 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:55:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:53 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6820002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:55:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:53.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:55:53 np0005548918 python3.9[186449]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:55:53.661 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:55:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:55:53.662 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:55:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:55:53.662 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 04:55:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:53.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:54 np0005548918 python3.9[186573]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014952.8753939-2288-174809968728853/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:54 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:54 np0005548918 python3.9[186725]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:54 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:55 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:55.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:55 np0005548918 python3.9[186849]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014954.2031355-2288-261812407888794/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:55.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:56 np0005548918 python3.9[187002]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:56 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6820002b30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:56 np0005548918 python3.9[187127]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014955.5131295-2288-204941309759410/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:56 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:57 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6808000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:57 np0005548918 podman[187251]: 2025-12-06 09:55:57.055086573 +0000 UTC m=+0.104228318 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  6 04:55:57 np0005548918 python3.9[187298]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:55:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:57.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:55:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:57.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:57 np0005548918 python3.9[187422]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014956.6745243-2288-268678973442237/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:58 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:58 np0005548918 python3.9[187575]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:58 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68180012c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:55:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:55:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:55:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:55:59 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:55:59 np0005548918 python3.9[187698]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014958.0430229-2288-237367563468055/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:55:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:59.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:55:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:55:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:59.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:55:59 np0005548918 python3.9[187851]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:55:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:55:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:00 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6808001aa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:00 np0005548918 python3.9[187975]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014959.2380958-2288-105702174401233/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:00 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:00 np0005548918 python3.9[188127]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:01 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:01.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:01 np0005548918 python3.9[188251]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014960.475481-2288-264039388681225/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:01.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:02 np0005548918 python3.9[188404]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:02 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:02 np0005548918 python3.9[188527]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014961.6741939-2288-236333391835033/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:02 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6808001aa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:03 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6818001dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:03.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:03 np0005548918 python3.9[188679]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:03.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:03 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:03 np0005548918 python3.9[188803]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014962.8102767-2288-249251056023979/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:04 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:04 np0005548918 python3.9[188956]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:04 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:05 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68080027b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:05.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:05 np0005548918 python3.9[189079]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014964.1275194-2288-97468434869061/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:05.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:05 np0005548918 python3.9[189232]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:06 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68180026e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:06 np0005548918 python3.9[189356]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014965.4258225-2288-35914568195725/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:06 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:07 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:07 np0005548918 python3.9[189508]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:07.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:07.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:07 np0005548918 python3.9[189632]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014966.6851258-2288-99080927498145/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095607 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:56:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:08 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6808002930 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:08 np0005548918 python3.9[189785]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:08 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68180026e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:08 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:09 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:09 np0005548918 python3.9[189933]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014967.9909778-2288-45175755269489/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:09.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:09.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.010389) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970010501, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4375, "num_deletes": 501, "total_data_size": 11830738, "memory_usage": 11979128, "flush_reason": "Manual Compaction"}
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Dec  6 04:56:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970063082, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 4439644, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13321, "largest_seqno": 17691, "table_properties": {"data_size": 4428209, "index_size": 6457, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3909, "raw_key_size": 30991, "raw_average_key_size": 19, "raw_value_size": 4401079, "raw_average_value_size": 2824, "num_data_blocks": 282, "num_entries": 1558, "num_filter_entries": 1558, "num_deletions": 501, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014557, "oldest_key_time": 1765014557, "file_creation_time": 1765014970, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 52737 microseconds, and 18719 cpu microseconds.
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.063142) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 4439644 bytes OK
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.063164) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.064945) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.064959) EVENT_LOG_v1 {"time_micros": 1765014970064955, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.064975) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 11811528, prev total WAL file size 11811528, number of live WAL files 2.
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.066904) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353031' seq:0, type:0; will stop at (end)
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(4335KB)], [27(13MB)]
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970066958, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 18464522, "oldest_snapshot_seqno": -1}
Dec  6 04:56:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:10 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5030 keys, 13936070 bytes, temperature: kUnknown
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970229217, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 13936070, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13900441, "index_size": 21951, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12613, "raw_key_size": 125894, "raw_average_key_size": 25, "raw_value_size": 13807292, "raw_average_value_size": 2744, "num_data_blocks": 917, "num_entries": 5030, "num_filter_entries": 5030, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765014970, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.229489) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 13936070 bytes
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.230565) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 113.7 rd, 85.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.2, 13.4 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(7.3) write-amplify(3.1) OK, records in: 5851, records dropped: 821 output_compression: NoCompression
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.230586) EVENT_LOG_v1 {"time_micros": 1765014970230574, "job": 14, "event": "compaction_finished", "compaction_time_micros": 162346, "compaction_time_cpu_micros": 27146, "output_level": 6, "num_output_files": 1, "total_output_size": 13936070, "num_input_records": 5851, "num_output_records": 5030, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970231513, "job": 14, "event": "table_file_deletion", "file_number": 29}
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970234024, "job": 14, "event": "table_file_deletion", "file_number": 27}
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.066849) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.234085) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.234089) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.234091) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.234093) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:10 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:10.234094) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:10 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6808003250 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:11 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68180026e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:11.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:11 np0005548918 python3.9[190086]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:56:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:11.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:12 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:12 np0005548918 python3.9[190242]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec  6 04:56:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:12 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:13 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6808003250 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:13.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:13.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:13 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:14 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6808003250 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:14 np0005548918 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec  6 04:56:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:14 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f67fc003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:14 np0005548918 python3.9[190400]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[178891]: 06/12/2025 09:56:15 : epoch 6933fd7f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6804004050 fd 39 proxy ignored for local
Dec  6 04:56:15 np0005548918 kernel: ganesha.nfsd[180527]: segfault at 50 ip 00007f68d1f1232e sp 00007f6896ffc210 error 4 in libntirpc.so.5.8[7f68d1ef7000+2c000] likely on CPU 4 (core 0, socket 4)
Dec  6 04:56:15 np0005548918 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 04:56:15 np0005548918 systemd[1]: Started Process Core Dump (PID 190469/UID 0).
Dec  6 04:56:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:15.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:15 np0005548918 python3.9[190555]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:15.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:16 np0005548918 python3.9[190708]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:16 np0005548918 systemd-coredump[190484]: Process 178895 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 46:#012#0  0x00007f68d1f1232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 04:56:16 np0005548918 systemd[1]: systemd-coredump@2-190469-0.service: Deactivated successfully.
Dec  6 04:56:16 np0005548918 systemd[1]: systemd-coredump@2-190469-0.service: Consumed 1.043s CPU time.
Dec  6 04:56:16 np0005548918 podman[190749]: 2025-12-06 09:56:16.356823633 +0000 UTC m=+0.029321298 container died 66ea029f16e25b79929c69fe4d2619fcdbba8859c307f483b2af383d02c317dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:56:16 np0005548918 systemd[1]: var-lib-containers-storage-overlay-772b008259e97b733137c23e52f6666d74d16c2e1b24061694396786477ebb89-merged.mount: Deactivated successfully.
Dec  6 04:56:16 np0005548918 podman[190749]: 2025-12-06 09:56:16.396291242 +0000 UTC m=+0.068788907 container remove 66ea029f16e25b79929c69fe4d2619fcdbba8859c307f483b2af383d02c317dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:56:16 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Main process exited, code=exited, status=139/n/a
Dec  6 04:56:16 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Failed with result 'exit-code'.
Dec  6 04:56:16 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.201s CPU time.
Dec  6 04:56:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:16 np0005548918 python3.9[190907]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:17.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:17 np0005548918 python3.9[191060]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:17.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:18 np0005548918 python3.9[191213]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:18 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:19.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:19 np0005548918 python3.9[191366]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:19 np0005548918 podman[191367]: 2025-12-06 09:56:19.479113636 +0000 UTC m=+0.099516672 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 04:56:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:19.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:19 np0005548918 python3.9[191545]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:20 np0005548918 python3.9[191697]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095621 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:56:21 np0005548918 python3.9[191849]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:21.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:21.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:22 np0005548918 python3.9[192003]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:56:22 np0005548918 systemd[1]: Reloading.
Dec  6 04:56:22 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:56:22 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:56:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:22 np0005548918 systemd[1]: Starting libvirt logging daemon socket...
Dec  6 04:56:22 np0005548918 systemd[1]: Listening on libvirt logging daemon socket.
Dec  6 04:56:22 np0005548918 systemd[1]: Starting libvirt logging daemon admin socket...
Dec  6 04:56:22 np0005548918 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec  6 04:56:22 np0005548918 systemd[1]: Starting libvirt logging daemon...
Dec  6 04:56:22 np0005548918 systemd[1]: Started libvirt logging daemon.
Dec  6 04:56:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:23.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:56:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:23.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:56:23 np0005548918 python3.9[192198]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:56:23 np0005548918 systemd[1]: Reloading.
Dec  6 04:56:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:23 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:56:23 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:56:23 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:24 np0005548918 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec  6 04:56:24 np0005548918 systemd[1]: Starting libvirt nodedev daemon socket...
Dec  6 04:56:24 np0005548918 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec  6 04:56:24 np0005548918 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec  6 04:56:24 np0005548918 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec  6 04:56:24 np0005548918 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec  6 04:56:24 np0005548918 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec  6 04:56:24 np0005548918 systemd[1]: Starting libvirt nodedev daemon...
Dec  6 04:56:24 np0005548918 systemd[1]: Started libvirt nodedev daemon.
Dec  6 04:56:24 np0005548918 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec  6 04:56:24 np0005548918 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec  6 04:56:24 np0005548918 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec  6 04:56:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:24 np0005548918 python3.9[192423]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:56:24 np0005548918 systemd[1]: Reloading.
Dec  6 04:56:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:25 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:56:25 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:56:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:25.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:25 np0005548918 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec  6 04:56:25 np0005548918 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec  6 04:56:25 np0005548918 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec  6 04:56:25 np0005548918 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec  6 04:56:25 np0005548918 systemd[1]: Starting libvirt proxy daemon...
Dec  6 04:56:25 np0005548918 systemd[1]: Started libvirt proxy daemon.
Dec  6 04:56:25 np0005548918 setroubleshoot[192236]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 831da983-71f9-4007-87dd-df1a98c6b4b7
Dec  6 04:56:25 np0005548918 setroubleshoot[192236]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  6 04:56:25 np0005548918 setroubleshoot[192236]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 831da983-71f9-4007-87dd-df1a98c6b4b7
Dec  6 04:56:25 np0005548918 setroubleshoot[192236]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  6 04:56:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:25.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:26 np0005548918 python3.9[192639]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:56:26 np0005548918 systemd[1]: Reloading.
Dec  6 04:56:26 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:56:26 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:56:26 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Scheduled restart job, restart counter is at 3.
Dec  6 04:56:26 np0005548918 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:56:26 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.201s CPU time.
Dec  6 04:56:26 np0005548918 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:56:26 np0005548918 systemd[1]: Listening on libvirt locking daemon socket.
Dec  6 04:56:26 np0005548918 systemd[1]: Starting libvirt QEMU daemon socket...
Dec  6 04:56:26 np0005548918 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec  6 04:56:26 np0005548918 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec  6 04:56:26 np0005548918 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec  6 04:56:26 np0005548918 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec  6 04:56:26 np0005548918 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec  6 04:56:26 np0005548918 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec  6 04:56:26 np0005548918 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec  6 04:56:26 np0005548918 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec  6 04:56:26 np0005548918 systemd[1]: Starting libvirt QEMU daemon...
Dec  6 04:56:26 np0005548918 systemd[1]: Started libvirt QEMU daemon.
Dec  6 04:56:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:26 np0005548918 podman[192772]: 2025-12-06 09:56:26.819534293 +0000 UTC m=+0.041827761 container create 87dfcb0e340d44130ec4ca12b2153164f196473dac821f5c24880f7da7a6e389 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Dec  6 04:56:26 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79b1c91bd479bcc5c3a7402f11e6b564ff1dd4c67e5fe5bfdfc7d5e587229828/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 04:56:26 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79b1c91bd479bcc5c3a7402f11e6b564ff1dd4c67e5fe5bfdfc7d5e587229828/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:56:26 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79b1c91bd479bcc5c3a7402f11e6b564ff1dd4c67e5fe5bfdfc7d5e587229828/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:56:26 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79b1c91bd479bcc5c3a7402f11e6b564ff1dd4c67e5fe5bfdfc7d5e587229828/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.sseuqb-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:56:26 np0005548918 podman[192772]: 2025-12-06 09:56:26.888823962 +0000 UTC m=+0.111117490 container init 87dfcb0e340d44130ec4ca12b2153164f196473dac821f5c24880f7da7a6e389 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  6 04:56:26 np0005548918 podman[192772]: 2025-12-06 09:56:26.895552461 +0000 UTC m=+0.117845949 container start 87dfcb0e340d44130ec4ca12b2153164f196473dac821f5c24880f7da7a6e389 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Dec  6 04:56:26 np0005548918 bash[192772]: 87dfcb0e340d44130ec4ca12b2153164f196473dac821f5c24880f7da7a6e389
Dec  6 04:56:26 np0005548918 podman[192772]: 2025-12-06 09:56:26.80355058 +0000 UTC m=+0.025844068 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:56:26 np0005548918 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:56:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:26 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 04:56:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:26 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 04:56:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:26 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 04:56:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:26 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 04:56:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:26 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 04:56:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:26 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 04:56:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:26 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 04:56:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:26 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:56:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:27 np0005548918 podman[192957]: 2025-12-06 09:56:27.212160313 +0000 UTC m=+0.094242941 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 04:56:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:27.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:27 np0005548918 python3.9[192958]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:56:27 np0005548918 systemd[1]: Reloading.
Dec  6 04:56:27 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:56:27 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:56:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:27.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:27 np0005548918 systemd[1]: Starting libvirt secret daemon socket...
Dec  6 04:56:27 np0005548918 systemd[1]: Listening on libvirt secret daemon socket.
Dec  6 04:56:27 np0005548918 systemd[1]: Starting libvirt secret daemon admin socket...
Dec  6 04:56:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:27 np0005548918 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec  6 04:56:27 np0005548918 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec  6 04:56:27 np0005548918 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec  6 04:56:27 np0005548918 systemd[1]: Starting libvirt secret daemon...
Dec  6 04:56:27 np0005548918 systemd[1]: Started libvirt secret daemon.
Dec  6 04:56:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:28 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:29.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:29 np0005548918 python3.9[193219]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:29.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:30 np0005548918 python3.9[193372]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  6 04:56:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:31 np0005548918 python3.9[193524]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:56:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:31.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:31.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:31 np0005548918 python3.9[193679]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  6 04:56:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:32 np0005548918 python3.9[193830]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:33 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:33 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:33 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:33 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:33 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:33 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:33 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:33 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:33 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:33 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:33 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:33 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:56:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:33.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:33 np0005548918 python3.9[193952]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014992.398172-3362-105119636868582/.source.xml follow=False _original_basename=secret.xml.j2 checksum=f7c948a7651e1e704e9fb6c67bea136c2b7876ec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:33.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:34 np0005548918 python3.9[194105]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 5ecd3f74-dade-5fc4-92ce-8950ae424258#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:56:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:35 np0005548918 python3.9[194267]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:35.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:35 np0005548918 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec  6 04:56:35 np0005548918 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.052s CPU time.
Dec  6 04:56:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:35.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:35 np0005548918 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec  6 04:56:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095635 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:56:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:56:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:37.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:56:37 np0005548918 python3.9[194733]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:37.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:38 np0005548918 python3.9[194886]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:38 np0005548918 python3.9[195009]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014997.8468614-3527-96040985368089/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000012:nfs.cephfs.1: -2
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:39 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:56:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:39.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:39.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:39 np0005548918 python3.9[195175]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:40 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb68000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:40 np0005548918 python3.9[195331]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:40 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb5c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:41 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb44000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:41 np0005548918 python3.9[195409]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:41.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:41.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:41 np0005548918 python3.9[195562]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:42 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb3c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:42 np0005548918 python3.9[195641]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.e1tbyf93 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:42 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb540016e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095643 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:56:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:43 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb5c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:43.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:43 np0005548918 python3.9[195794]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.380856) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003380931, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 543, "num_deletes": 251, "total_data_size": 893228, "memory_usage": 903464, "flush_reason": "Manual Compaction"}
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003387886, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 589687, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17697, "largest_seqno": 18234, "table_properties": {"data_size": 586862, "index_size": 861, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6542, "raw_average_key_size": 18, "raw_value_size": 581332, "raw_average_value_size": 1665, "num_data_blocks": 39, "num_entries": 349, "num_filter_entries": 349, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014970, "oldest_key_time": 1765014970, "file_creation_time": 1765015003, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 7064 microseconds, and 2486 cpu microseconds.
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.387927) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 589687 bytes OK
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.387943) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.389884) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.389895) EVENT_LOG_v1 {"time_micros": 1765015003389892, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.389910) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 890100, prev total WAL file size 890100, number of live WAL files 2.
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.390427) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(575KB)], [30(13MB)]
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003390470, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 14525757, "oldest_snapshot_seqno": -1}
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4869 keys, 12334008 bytes, temperature: kUnknown
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003520485, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 12334008, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12300736, "index_size": 19978, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 123182, "raw_average_key_size": 25, "raw_value_size": 12211582, "raw_average_value_size": 2508, "num_data_blocks": 830, "num_entries": 4869, "num_filter_entries": 4869, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765015003, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.520680) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 12334008 bytes
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.521879) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 111.7 rd, 94.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 13.3 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(45.5) write-amplify(20.9) OK, records in: 5379, records dropped: 510 output_compression: NoCompression
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.521894) EVENT_LOG_v1 {"time_micros": 1765015003521887, "job": 16, "event": "compaction_finished", "compaction_time_micros": 130076, "compaction_time_cpu_micros": 21545, "output_level": 6, "num_output_files": 1, "total_output_size": 12334008, "num_input_records": 5379, "num_output_records": 4869, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003522069, "job": 16, "event": "table_file_deletion", "file_number": 32}
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003524426, "job": 16, "event": "table_file_deletion", "file_number": 30}
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.390344) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.524493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.524499) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.524501) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.524502) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:56:43.524504) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:56:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:43.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:43 np0005548918 python3.9[195872]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:43 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:44 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:44 np0005548918 python3.9[196025]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:56:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:44 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:45 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb54002200 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:45.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:45 np0005548918 python3[196179]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  6 04:56:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:45.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:46 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb5c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:46 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:47 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:47.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:47 np0005548918 python3.9[196333]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:47.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:47 np0005548918 python3.9[196411]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:48 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb54002200 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:48 np0005548918 python3.9[196564]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:48 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb5c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:49 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb5c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:49 np0005548918 python3.9[196667]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:49.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:49.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:49 np0005548918 auditd[708]: Audit daemon rotating log files
Dec  6 04:56:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:49 np0005548918 podman[196792]: 2025-12-06 09:56:49.803826949 +0000 UTC m=+0.103139498 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  6 04:56:49 np0005548918 python3.9[196836]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:50 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb3c0019e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:50 np0005548918 python3.9[196924]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:50 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb54002200 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:51 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:51 np0005548918 python3.9[197076]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:51.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:51 np0005548918 python3.9[197155]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:51.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:52 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb5c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:52 np0005548918 python3.9[197341]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:52 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:53 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb3c0019e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:53 np0005548918 python3.9[197512]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765015011.9499526-3902-34279425502960/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:53.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:56:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:56:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:56:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:56:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:56:53.662 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:56:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:56:53.663 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:56:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:56:53.663 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 04:56:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:53.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:53 np0005548918 python3.9[197666]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:54 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb54003690 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:54 np0005548918 python3.9[197818]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:56:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:54 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb5c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:55 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440036e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:55.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:55 np0005548918 python3.9[197974]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:56:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:55.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:56:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:56 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb3c002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:56 np0005548918 python3.9[198127]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:56:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:56 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440036e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:57 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb5c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:57.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:57 np0005548918 python3.9[198281]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:56:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:56:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:57.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:56:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:57 np0005548918 podman[198408]: 2025-12-06 09:56:57.904915464 +0000 UTC m=+0.070202414 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  6 04:56:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:58 np0005548918 python3.9[198454]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:56:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:58 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb54003690 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:58 np0005548918 python3.9[198635]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:56:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:58 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb3c002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:56:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:56:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:56:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:56:59 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440036e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:56:59 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:56:59 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:56:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:56:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:59.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:56:59 np0005548918 python3.9[198788]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:56:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:56:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:56:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:59.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:56:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:56:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:00 np0005548918 python3.9[198912]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015019.0588715-4118-90455283786027/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:00 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb5c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:00 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb54003690 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:00 np0005548918 python3.9[199064]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:57:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:01 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb3c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:01.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:01 np0005548918 python3.9[199188]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015020.4399865-4163-3538988536/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:01.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:02 np0005548918 python3.9[199341]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:57:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:02 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440036e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:02 np0005548918 python3.9[199464]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015021.6948514-4208-117113323520324/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:02 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb5c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:03 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb54003690 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:03.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:03 np0005548918 python3.9[199617]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:57:03 np0005548918 systemd[1]: Reloading.
Dec  6 04:57:03 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:57:03 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:57:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:03.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:03 np0005548918 systemd[1]: Reached target edpm_libvirt.target.
Dec  6 04:57:03 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:04 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb3c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:04 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440036e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:04 np0005548918 python3.9[199810]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  6 04:57:04 np0005548918 systemd[1]: Reloading.
Dec  6 04:57:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:05 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb5c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:05 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:57:05 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:57:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:05.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:05 np0005548918 systemd[1]: Reloading.
Dec  6 04:57:05 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:57:05 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:57:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:05.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:06 np0005548918 systemd[1]: session-52.scope: Deactivated successfully.
Dec  6 04:57:06 np0005548918 systemd[1]: session-52.scope: Consumed 3min 33.245s CPU time.
Dec  6 04:57:06 np0005548918 systemd-logind[800]: Session 52 logged out. Waiting for processes to exit.
Dec  6 04:57:06 np0005548918 systemd-logind[800]: Removed session 52.
Dec  6 04:57:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:06 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb54003690 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:06 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb3c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:07 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440036e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:57:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:07.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:57:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:07.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:08 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb5c003cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:08 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb54003690 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:08 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:09 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb3c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:09.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:09.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:10 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440036e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:10 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb5c003cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:11 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb54003690 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:11.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:11.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:11 np0005548918 systemd-logind[800]: New session 53 of user zuul.
Dec  6 04:57:11 np0005548918 systemd[1]: Started Session 53 of User zuul.
Dec  6 04:57:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:12 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb3c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:12 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440036e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:12 np0005548918 python3.9[200098]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:57:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:13 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440036e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:13.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:13.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:13 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:14 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440036e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:14 np0005548918 python3.9[200254]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:57:14 np0005548918 network[200271]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:57:14 np0005548918 network[200272]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:57:14 np0005548918 network[200273]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:57:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:14 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb3c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:15 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb68000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:15.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:15.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:16 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb68000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:16 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb38000d00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:17 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb38000d00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:17.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:17.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:18 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440038a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:18 np0005548918 python3.9[200550]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  6 04:57:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:18 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb30000d00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:18 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:19 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb68000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:19.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:19 np0005548918 python3.9[200635]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:57:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:19.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:20 np0005548918 podman[200638]: 2025-12-06 09:57:20.218298217 +0000 UTC m=+0.107270838 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 04:57:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:20 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb38001cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:20 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440038c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:21 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb30001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:57:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:21.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:57:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:21.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:22 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb680089d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:22 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb38001cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:23 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb440038e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:23.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:23.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:24 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:24 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb30001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:24 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb680089d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:25 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb38001cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:25.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:25.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:26 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb44003900 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:26 np0005548918 python3.9[200823]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:57:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:26 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb30001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:27 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb680096e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:27.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:27 np0005548918 python3.9[200976]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:57:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:27.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:28 np0005548918 podman[201102]: 2025-12-06 09:57:28.149355418 +0000 UTC m=+0.052965047 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec  6 04:57:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:28 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb680096e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:28 np0005548918 python3.9[201146]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:57:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:28 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb44003920 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:28 np0005548918 python3.9[201298]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:57:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:29 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb30002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:29.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:29 np0005548918 python3.9[201477]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:57:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:29.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:30 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb380030a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:30 np0005548918 python3.9[201601]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015049.3118174-247-196067877021620/.source.iscsi _original_basename=.rfpg8zlu follow=False checksum=2f26a224dbf494d58f6630d76596dda7e38abdb5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:30 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb680096e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:31 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb44003940 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:31.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:31 np0005548918 python3.9[201754]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:31.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:32 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb30002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:32 np0005548918 python3.9[201907]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:32 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb380030a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:33 np0005548918 kernel: ganesha.nfsd[200000]: segfault at 50 ip 00007fcc111d832e sp 00007fcbe0ff8210 error 4 in libntirpc.so.5.8[7fcc111bd000+2c000] likely on CPU 4 (core 0, socket 4)
Dec  6 04:57:33 np0005548918 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 04:57:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[192832]: 06/12/2025 09:57:33 : epoch 6933fdca : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcb380030a0 fd 37 proxy ignored for local
Dec  6 04:57:33 np0005548918 systemd[1]: Started Process Core Dump (PID 202007/UID 0).
Dec  6 04:57:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:33.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:33 np0005548918 python3.9[202062]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:57:33 np0005548918 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec  6 04:57:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:33.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:34 np0005548918 systemd-coredump[202008]: Process 192844 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 57:#012#0  0x00007fcc111d832e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 04:57:34 np0005548918 systemd[1]: systemd-coredump@3-202007-0.service: Deactivated successfully.
Dec  6 04:57:34 np0005548918 systemd[1]: systemd-coredump@3-202007-0.service: Consumed 1.071s CPU time.
Dec  6 04:57:34 np0005548918 podman[202224]: 2025-12-06 09:57:34.411131139 +0000 UTC m=+0.024015918 container died 87dfcb0e340d44130ec4ca12b2153164f196473dac821f5c24880f7da7a6e389 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Dec  6 04:57:34 np0005548918 systemd[1]: var-lib-containers-storage-overlay-79b1c91bd479bcc5c3a7402f11e6b564ff1dd4c67e5fe5bfdfc7d5e587229828-merged.mount: Deactivated successfully.
Dec  6 04:57:34 np0005548918 podman[202224]: 2025-12-06 09:57:34.440514479 +0000 UTC m=+0.053399238 container remove 87dfcb0e340d44130ec4ca12b2153164f196473dac821f5c24880f7da7a6e389 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 04:57:34 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Main process exited, code=exited, status=139/n/a
Dec  6 04:57:34 np0005548918 python3.9[202220]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:57:34 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Failed with result 'exit-code'.
Dec  6 04:57:34 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.414s CPU time.
Dec  6 04:57:34 np0005548918 systemd[1]: Reloading.
Dec  6 04:57:34 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:57:34 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:57:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:34 np0005548918 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  6 04:57:34 np0005548918 systemd[1]: Starting Open-iSCSI...
Dec  6 04:57:34 np0005548918 kernel: Loading iSCSI transport class v2.0-870.
Dec  6 04:57:34 np0005548918 systemd[1]: Started Open-iSCSI.
Dec  6 04:57:35 np0005548918 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec  6 04:57:35 np0005548918 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec  6 04:57:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:35.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:35.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:35 np0005548918 python3.9[202465]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:57:36 np0005548918 network[202483]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:57:36 np0005548918 network[202484]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:57:36 np0005548918 network[202485]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:57:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:37.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:37.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:39 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095739 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:57:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:39.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:39.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:40 np0005548918 python3.9[202761]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  6 04:57:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:41.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:41 np0005548918 python3.9[202914]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec  6 04:57:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:41.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:42 np0005548918 python3.9[203071]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:57:42 np0005548918 python3.9[203194]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015061.6843476-478-156550103977315/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:43.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:43 np0005548918 python3.9[203347]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:43.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:44 np0005548918 python3.9[203500]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:57:44 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Scheduled restart job, restart counter is at 4.
Dec  6 04:57:44 np0005548918 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:57:44 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.414s CPU time.
Dec  6 04:57:44 np0005548918 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 04:57:44 np0005548918 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  6 04:57:44 np0005548918 systemd[1]: Stopped Load Kernel Modules.
Dec  6 04:57:44 np0005548918 systemd[1]: Stopping Load Kernel Modules...
Dec  6 04:57:44 np0005548918 systemd[1]: Starting Load Kernel Modules...
Dec  6 04:57:44 np0005548918 systemd[1]: Finished Load Kernel Modules.
Dec  6 04:57:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:44 np0005548918 podman[203554]: 2025-12-06 09:57:44.990516204 +0000 UTC m=+0.041092602 container create b5d95f66151d04075342e932f946282ba7899cc2664931d61ab4171c151f756b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  6 04:57:45 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07dede252d6d4d715c4c6238c1928f24d65a663b0e0d5407b5d663a6ecfde2d8/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 04:57:45 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07dede252d6d4d715c4c6238c1928f24d65a663b0e0d5407b5d663a6ecfde2d8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 04:57:45 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07dede252d6d4d715c4c6238c1928f24d65a663b0e0d5407b5d663a6ecfde2d8/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:57:45 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07dede252d6d4d715c4c6238c1928f24d65a663b0e0d5407b5d663a6ecfde2d8/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.sseuqb-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 04:57:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:45 np0005548918 podman[203554]: 2025-12-06 09:57:45.055303963 +0000 UTC m=+0.105880381 container init b5d95f66151d04075342e932f946282ba7899cc2664931d61ab4171c151f756b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 04:57:45 np0005548918 podman[203554]: 2025-12-06 09:57:44.969367612 +0000 UTC m=+0.019944020 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 04:57:45 np0005548918 podman[203554]: 2025-12-06 09:57:45.069384197 +0000 UTC m=+0.119960595 container start b5d95f66151d04075342e932f946282ba7899cc2664931d61ab4171c151f756b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  6 04:57:45 np0005548918 bash[203554]: b5d95f66151d04075342e932f946282ba7899cc2664931d61ab4171c151f756b
Dec  6 04:57:45 np0005548918 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 04:57:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 04:57:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 04:57:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 04:57:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 04:57:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 04:57:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 04:57:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 04:57:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 04:57:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:45.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:45 np0005548918 python3.9[203760]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:57:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:45.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:46 np0005548918 python3.9[203913]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:57:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:47.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:47 np0005548918 python3.9[204066]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:57:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:47.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:48 np0005548918 python3.9[204219]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:57:48 np0005548918 python3.9[204342]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015067.648224-653-129811354128829/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:49 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:49.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:49 np0005548918 python3.9[204520]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:57:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:49.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:50 np0005548918 podman[204674]: 2025-12-06 09:57:50.378415202 +0000 UTC m=+0.109495396 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  6 04:57:50 np0005548918 python3.9[204675]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:51 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 04:57:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:51 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 04:57:51 np0005548918 python3.9[204854]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:51.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:51.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:52 np0005548918 python3.9[205007]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:52 np0005548918 python3.9[205159]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:53.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:53 np0005548918 python3.9[205312]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:57:53.663 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:57:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:57:53.664 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:57:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:57:53.664 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 04:57:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:53.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:54 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:54 np0005548918 python3.9[205465]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:55.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:55 np0005548918 python3.9[205618]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:55.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:56 np0005548918 python3.9[205771]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:57:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:56 np0005548918 python3.9[205925]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:57.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 04:57:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:57.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:57:57 np0005548918 python3.9[206089]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:57:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:58 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5478000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:58 np0005548918 podman[206169]: 2025-12-06 09:57:58.625205174 +0000 UTC m=+0.079146771 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec  6 04:57:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:58 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:58 np0005548918 python3.9[206316]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:57:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:57:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:59 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:57:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:57:59 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:57:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:57:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:59.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:57:59 np0005548918 python3.9[206425]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:57:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:57:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:57:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:57:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:57:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:59.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:00 np0005548918 python3.9[206578]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:00 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:00 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:58:00 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:58:00 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:58:00 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:58:00 np0005548918 python3.9[206656]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:58:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:00 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095801 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 04:58:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:01 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.002000052s ======
Dec  6 04:58:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:01.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Dec  6 04:58:01 np0005548918 python3.9[206809]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:01.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:02 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:02 np0005548918 python3.9[206962]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:02 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54500016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:03 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:03 np0005548918 python3.9[207040]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:58:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:03.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:58:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:58:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:03.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:58:04 np0005548918 python3.9[207194]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:04 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:04 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:04 np0005548918 python3.9[207272]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:04 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:05 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54500016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:58:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:05.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:58:05 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:58:05 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:58:05 np0005548918 python3.9[207425]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:58:05 np0005548918 systemd[1]: Reloading.
Dec  6 04:58:05 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:05 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:58:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:05.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:58:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:06 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:06 np0005548918 python3.9[207640]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:06 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:07 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:07.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:07 np0005548918 python3.9[207719]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:07.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:08 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54500016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:08 np0005548918 python3.9[207872]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:08 np0005548918 python3.9[207950]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:08 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:09 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:09 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:09.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:58:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:09.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:58:09 np0005548918 python3.9[208128]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:58:09 np0005548918 systemd[1]: Reloading.
Dec  6 04:58:10 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:10 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:10 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:10 np0005548918 systemd[1]: Starting Create netns directory...
Dec  6 04:58:10 np0005548918 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  6 04:58:10 np0005548918 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  6 04:58:10 np0005548918 systemd[1]: Finished Create netns directory.
Dec  6 04:58:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:10 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:11 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:11.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:11 np0005548918 python3.9[208325]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:58:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:11.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:12 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:12 np0005548918 python3.9[208478]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:12 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:12 np0005548918 python3.9[208601]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015091.7944505-1274-190263817568892/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:58:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:13 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:13.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:13.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:14 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:14 np0005548918 python3.9[208755]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:58:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:14 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:14 np0005548918 python3.9[208907]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:14 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:15 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:15.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:15 np0005548918 python3.9[209031]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015094.4531274-1348-819195455697/.source.json _original_basename=.1qajjigx follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:58:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:15.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:58:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:16 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:16 np0005548918 python3.9[209184]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:16 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:17 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:58:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:17.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:58:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:58:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:17.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:58:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:18 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:18 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:19 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:19 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:19 np0005548918 python3.9[209613]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec  6 04:58:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:19.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:19.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:20 np0005548918 python3.9[209767]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  6 04:58:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:20 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:20 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:21 np0005548918 podman[209891]: 2025-12-06 09:58:21.100479603 +0000 UTC m=+0.071998724 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 04:58:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:21 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:21 np0005548918 python3.9[209936]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  6 04:58:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:21.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:21.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:22 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:22 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:23 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:58:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:23.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:58:23 np0005548918 python3[210126]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  6 04:58:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:23.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:24 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:24 np0005548918 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec  6 04:58:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:24 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:24 np0005548918 podman[210140]: 2025-12-06 09:58:24.56521775 +0000 UTC m=+1.093105316 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842
Dec  6 04:58:24 np0005548918 podman[210200]: 2025-12-06 09:58:24.691004465 +0000 UTC m=+0.042750998 container create 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  6 04:58:24 np0005548918 podman[210200]: 2025-12-06 09:58:24.667882725 +0000 UTC m=+0.019629278 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842
Dec  6 04:58:24 np0005548918 python3[210126]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842
Dec  6 04:58:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:24 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:25 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:25.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:25 np0005548918 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  6 04:58:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:25.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:25 np0005548918 python3.9[210392]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:58:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:26 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:26 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:26 np0005548918 python3.9[210547]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:27 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:27.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:27 np0005548918 python3.9[210624]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:58:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:27.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:28 np0005548918 python3.9[210776]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765015107.5041234-1612-165945381666598/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:28 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:28 np0005548918 python3.9[210852]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:58:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:28 np0005548918 systemd[1]: Reloading.
Dec  6 04:58:28 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:28 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:28 np0005548918 podman[210854]: 2025-12-06 09:58:28.932307111 +0000 UTC m=+0.098531464 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:58:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:28 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:29 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:29.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:29 np0005548918 python3.9[211005]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:58:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:29 np0005548918 systemd[1]: Reloading.
Dec  6 04:58:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:29.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:29 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:29 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:30 np0005548918 systemd[1]: Starting multipathd container...
Dec  6 04:58:30 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:58:30 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fae8c3267380fb3101540188972b31cc3765922a8003751800bb0cfe5c443923/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  6 04:58:30 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fae8c3267380fb3101540188972b31cc3765922a8003751800bb0cfe5c443923/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  6 04:58:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:30 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:30 np0005548918 systemd[1]: Started /usr/bin/podman healthcheck run 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21.
Dec  6 04:58:30 np0005548918 podman[211045]: 2025-12-06 09:58:30.309875479 +0000 UTC m=+0.121710736 container init 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:58:30 np0005548918 multipathd[211060]: + sudo -E kolla_set_configs
Dec  6 04:58:30 np0005548918 podman[211045]: 2025-12-06 09:58:30.35126208 +0000 UTC m=+0.163097297 container start 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  6 04:58:30 np0005548918 podman[211045]: multipathd
Dec  6 04:58:30 np0005548918 systemd[1]: Started multipathd container.
Dec  6 04:58:30 np0005548918 multipathd[211060]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  6 04:58:30 np0005548918 multipathd[211060]: INFO:__main__:Validating config file
Dec  6 04:58:30 np0005548918 multipathd[211060]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  6 04:58:30 np0005548918 multipathd[211060]: INFO:__main__:Writing out command to execute
Dec  6 04:58:30 np0005548918 multipathd[211060]: ++ cat /run_command
Dec  6 04:58:30 np0005548918 multipathd[211060]: + CMD='/usr/sbin/multipathd -d'
Dec  6 04:58:30 np0005548918 multipathd[211060]: + ARGS=
Dec  6 04:58:30 np0005548918 multipathd[211060]: + sudo kolla_copy_cacerts
Dec  6 04:58:30 np0005548918 podman[211069]: 2025-12-06 09:58:30.415542515 +0000 UTC m=+0.056389284 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec  6 04:58:30 np0005548918 multipathd[211060]: + [[ ! -n '' ]]
Dec  6 04:58:30 np0005548918 multipathd[211060]: + . kolla_extend_start
Dec  6 04:58:30 np0005548918 multipathd[211060]: Running command: '/usr/sbin/multipathd -d'
Dec  6 04:58:30 np0005548918 multipathd[211060]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  6 04:58:30 np0005548918 multipathd[211060]: + umask 0022
Dec  6 04:58:30 np0005548918 multipathd[211060]: + exec /usr/sbin/multipathd -d
Dec  6 04:58:30 np0005548918 systemd[1]: 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21-63ba761035b21724.service: Main process exited, code=exited, status=1/FAILURE
Dec  6 04:58:30 np0005548918 systemd[1]: 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21-63ba761035b21724.service: Failed with result 'exit-code'.
Dec  6 04:58:30 np0005548918 multipathd[211060]: 3484.091224 | --------start up--------
Dec  6 04:58:30 np0005548918 multipathd[211060]: 3484.091243 | read /etc/multipath.conf
Dec  6 04:58:30 np0005548918 multipathd[211060]: 3484.096505 | path checkers start up
Dec  6 04:58:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:30 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:31 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:58:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:31.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:58:31 np0005548918 python3.9[211251]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:58:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:31.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:32 np0005548918 python3.9[211406]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:58:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:32 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:32 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:33 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:33 np0005548918 python3.9[211571]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:58:33 np0005548918 systemd[1]: Stopping multipathd container...
Dec  6 04:58:33 np0005548918 multipathd[211060]: 3486.970773 | exit (signal)
Dec  6 04:58:33 np0005548918 multipathd[211060]: 3486.970856 | --------shut down-------
Dec  6 04:58:33 np0005548918 systemd[1]: libpod-33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21.scope: Deactivated successfully.
Dec  6 04:58:33 np0005548918 podman[211576]: 2025-12-06 09:58:33.343620941 +0000 UTC m=+0.081008515 container died 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec  6 04:58:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:33.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:33 np0005548918 systemd[1]: 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21-63ba761035b21724.timer: Deactivated successfully.
Dec  6 04:58:33 np0005548918 systemd[1]: Stopped /usr/bin/podman healthcheck run 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21.
Dec  6 04:58:33 np0005548918 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21-userdata-shm.mount: Deactivated successfully.
Dec  6 04:58:33 np0005548918 systemd[1]: var-lib-containers-storage-overlay-fae8c3267380fb3101540188972b31cc3765922a8003751800bb0cfe5c443923-merged.mount: Deactivated successfully.
Dec  6 04:58:33 np0005548918 podman[211576]: 2025-12-06 09:58:33.7646888 +0000 UTC m=+0.502076384 container cleanup 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd)
Dec  6 04:58:33 np0005548918 podman[211576]: multipathd
Dec  6 04:58:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:33 np0005548918 podman[211604]: multipathd
Dec  6 04:58:33 np0005548918 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec  6 04:58:33 np0005548918 systemd[1]: Stopped multipathd container.
Dec  6 04:58:33 np0005548918 systemd[1]: Starting multipathd container...
Dec  6 04:58:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:33.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:33 np0005548918 systemd[1]: Started libcrun container.
Dec  6 04:58:33 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fae8c3267380fb3101540188972b31cc3765922a8003751800bb0cfe5c443923/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  6 04:58:33 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fae8c3267380fb3101540188972b31cc3765922a8003751800bb0cfe5c443923/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  6 04:58:34 np0005548918 systemd[1]: Started /usr/bin/podman healthcheck run 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21.
Dec  6 04:58:34 np0005548918 podman[211618]: 2025-12-06 09:58:34.030406621 +0000 UTC m=+0.143742188 container init 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec  6 04:58:34 np0005548918 multipathd[211633]: + sudo -E kolla_set_configs
Dec  6 04:58:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:34 np0005548918 podman[211618]: 2025-12-06 09:58:34.064897327 +0000 UTC m=+0.178232804 container start 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  6 04:58:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:34 np0005548918 podman[211618]: multipathd
Dec  6 04:58:34 np0005548918 systemd[1]: Started multipathd container.
Dec  6 04:58:34 np0005548918 multipathd[211633]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  6 04:58:34 np0005548918 multipathd[211633]: INFO:__main__:Validating config file
Dec  6 04:58:34 np0005548918 multipathd[211633]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  6 04:58:34 np0005548918 multipathd[211633]: INFO:__main__:Writing out command to execute
Dec  6 04:58:34 np0005548918 multipathd[211633]: ++ cat /run_command
Dec  6 04:58:34 np0005548918 multipathd[211633]: + CMD='/usr/sbin/multipathd -d'
Dec  6 04:58:34 np0005548918 multipathd[211633]: + ARGS=
Dec  6 04:58:34 np0005548918 multipathd[211633]: + sudo kolla_copy_cacerts
Dec  6 04:58:34 np0005548918 multipathd[211633]: + [[ ! -n '' ]]
Dec  6 04:58:34 np0005548918 multipathd[211633]: + . kolla_extend_start
Dec  6 04:58:34 np0005548918 multipathd[211633]: Running command: '/usr/sbin/multipathd -d'
Dec  6 04:58:34 np0005548918 multipathd[211633]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  6 04:58:34 np0005548918 multipathd[211633]: + umask 0022
Dec  6 04:58:34 np0005548918 multipathd[211633]: + exec /usr/sbin/multipathd -d
Dec  6 04:58:34 np0005548918 podman[211640]: 2025-12-06 09:58:34.154144922 +0000 UTC m=+0.071611493 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  6 04:58:34 np0005548918 systemd[1]: 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21-7e92520df872aac6.service: Main process exited, code=exited, status=1/FAILURE
Dec  6 04:58:34 np0005548918 systemd[1]: 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21-7e92520df872aac6.service: Failed with result 'exit-code'.
Dec  6 04:58:34 np0005548918 multipathd[211633]: 3487.832703 | --------start up--------
Dec  6 04:58:34 np0005548918 multipathd[211633]: 3487.832723 | read /etc/multipath.conf
Dec  6 04:58:34 np0005548918 multipathd[211633]: 3487.839495 | path checkers start up
Dec  6 04:58:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:34 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:34 np0005548918 python3.9[211823]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:34 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:35 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:35.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:35.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:35 np0005548918 python3.9[211976]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  6 04:58:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:36 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:36 np0005548918 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  6 04:58:36 np0005548918 systemd[1]: virtqemud.service: Deactivated successfully.
Dec  6 04:58:36 np0005548918 python3.9[212129]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec  6 04:58:36 np0005548918 kernel: Key type psk registered
Dec  6 04:58:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:36 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:37 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:37.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:37 np0005548918 python3.9[212293]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:58:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:37.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:38 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:38 np0005548918 python3.9[212417]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015117.224801-1852-238204621761993/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:38 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:39 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:39 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:39.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:39 np0005548918 python3.9[212570]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:39.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.207165) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120207266, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1335, "num_deletes": 256, "total_data_size": 3266759, "memory_usage": 3311880, "flush_reason": "Manual Compaction"}
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120237096, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2138927, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18240, "largest_seqno": 19569, "table_properties": {"data_size": 2133279, "index_size": 3039, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11449, "raw_average_key_size": 18, "raw_value_size": 2121946, "raw_average_value_size": 3455, "num_data_blocks": 137, "num_entries": 614, "num_filter_entries": 614, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015003, "oldest_key_time": 1765015003, "file_creation_time": 1765015120, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 29965 microseconds, and 9267 cpu microseconds.
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.237166) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2138927 bytes OK
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.237221) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.239573) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.239596) EVENT_LOG_v1 {"time_micros": 1765015120239588, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.239621) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3260494, prev total WAL file size 3260494, number of live WAL files 2.
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.241212) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323533' seq:0, type:0; will stop at (end)
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2088KB)], [33(11MB)]
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120241258, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 14472935, "oldest_snapshot_seqno": -1}
Dec  6 04:58:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:40 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4957 keys, 13987206 bytes, temperature: kUnknown
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120411988, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13987206, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13952357, "index_size": 21363, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12421, "raw_key_size": 126142, "raw_average_key_size": 25, "raw_value_size": 13860673, "raw_average_value_size": 2796, "num_data_blocks": 876, "num_entries": 4957, "num_filter_entries": 4957, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765015120, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.412228) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13987206 bytes
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.415103) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 84.7 rd, 81.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 11.8 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(13.3) write-amplify(6.5) OK, records in: 5483, records dropped: 526 output_compression: NoCompression
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.415219) EVENT_LOG_v1 {"time_micros": 1765015120415159, "job": 18, "event": "compaction_finished", "compaction_time_micros": 170793, "compaction_time_cpu_micros": 50623, "output_level": 6, "num_output_files": 1, "total_output_size": 13987206, "num_input_records": 5483, "num_output_records": 4957, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120416221, "job": 18, "event": "table_file_deletion", "file_number": 35}
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120419619, "job": 18, "event": "table_file_deletion", "file_number": 33}
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.241096) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.419770) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.419780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.419782) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.419785) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:58:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-09:58:40.419786) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 04:58:40 np0005548918 python3.9[212723]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:58:40 np0005548918 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  6 04:58:40 np0005548918 systemd[1]: Stopped Load Kernel Modules.
Dec  6 04:58:40 np0005548918 systemd[1]: Stopping Load Kernel Modules...
Dec  6 04:58:40 np0005548918 systemd[1]: Starting Load Kernel Modules...
Dec  6 04:58:40 np0005548918 systemd[1]: Finished Load Kernel Modules.
Dec  6 04:58:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:40 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:41 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:41.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:41 np0005548918 python3.9[212880]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  6 04:58:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:41.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:42 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:42 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:43 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:58:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:43.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:58:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:43.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:44 np0005548918 systemd[1]: Reloading.
Dec  6 04:58:44 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:44 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:44 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:44 np0005548918 systemd[1]: Reloading.
Dec  6 04:58:44 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:44 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:44 np0005548918 systemd-logind[800]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  6 04:58:44 np0005548918 systemd-logind[800]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  6 04:58:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:44 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:44 np0005548918 lvm[212997]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 04:58:44 np0005548918 lvm[212997]: VG ceph_vg0 finished
Dec  6 04:58:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:45 np0005548918 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  6 04:58:45 np0005548918 systemd[1]: Starting man-db-cache-update.service...
Dec  6 04:58:45 np0005548918 systemd[1]: Reloading.
Dec  6 04:58:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:45 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:45 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:45.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:45 np0005548918 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  6 04:58:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:45.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:46 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:46 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:47 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:47 np0005548918 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  6 04:58:47 np0005548918 systemd[1]: Finished man-db-cache-update.service.
Dec  6 04:58:47 np0005548918 systemd[1]: man-db-cache-update.service: Consumed 1.778s CPU time.
Dec  6 04:58:47 np0005548918 systemd[1]: run-r33c60536bf15419286b818ec96450bf2.service: Deactivated successfully.
Dec  6 04:58:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:47.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:47 np0005548918 python3.9[214342]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 04:58:47 np0005548918 systemd[1]: Stopping Open-iSCSI...
Dec  6 04:58:47 np0005548918 iscsid[202303]: iscsid shutting down.
Dec  6 04:58:47 np0005548918 systemd[1]: iscsid.service: Deactivated successfully.
Dec  6 04:58:47 np0005548918 systemd[1]: Stopped Open-iSCSI.
Dec  6 04:58:47 np0005548918 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  6 04:58:47 np0005548918 systemd[1]: Starting Open-iSCSI...
Dec  6 04:58:47 np0005548918 systemd[1]: Started Open-iSCSI.
Dec  6 04:58:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:47.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:48 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:48 np0005548918 python3.9[214497]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  6 04:58:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:48 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:49 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:49 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:49.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:49 np0005548918 python3.9[214679]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:58:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:49.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:50 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:50 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:50 np0005548918 python3.9[214832]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:58:51 np0005548918 systemd[1]: Reloading.
Dec  6 04:58:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:51 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:58:51 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:58:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:51 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:51.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:51 np0005548918 podman[214869]: 2025-12-06 09:58:51.505685064 +0000 UTC m=+0.153181941 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 04:58:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:58:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:51.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:58:52 np0005548918 python3.9[215045]: ansible-ansible.builtin.service_facts Invoked
Dec  6 04:58:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:52 np0005548918 network[215062]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  6 04:58:52 np0005548918 network[215063]: 'network-scripts' will be removed from distribution in near future.
Dec  6 04:58:52 np0005548918 network[215064]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  6 04:58:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:52 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:52 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:53 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:53.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:58:53.664 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:58:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:58:53.665 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:58:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:58:53.665 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 04:58:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:58:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:53.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:58:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:54 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:54 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:54 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:55 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:55.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:55.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:56 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:56 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:57.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:58:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:57.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:58 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:58 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:58:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:59 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:58:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:58:59 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:58:59 np0005548918 podman[215346]: 2025-12-06 09:58:59.283282888 +0000 UTC m=+0.088623050 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  6 04:58:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:58:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:59.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:58:59 np0005548918 python3.9[215347]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:58:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:58:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:58:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:58:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:58:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:59.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:00 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:00 np0005548918 python3.9[215518]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:00 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:00 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:01 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:01 np0005548918 python3.9[215672]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:01.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:01 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:01.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:02 np0005548918 python3.9[215826]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:02 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c0014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:02 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:02 np0005548918 python3.9[215981]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:02 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:03 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:03.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:03 np0005548918 python3.9[216135]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:03 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:59:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:03.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:59:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:04 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:04 np0005548918 podman[216261]: 2025-12-06 09:59:04.29806384 +0000 UTC m=+0.071613123 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  6 04:59:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:04 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:04 np0005548918 python3.9[216307]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:04 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:04 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:05 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:05 np0005548918 python3.9[216463]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 04:59:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:05.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:05 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:59:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:05.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:59:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:06 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:06 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 04:59:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 04:59:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:06 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:06 np0005548918 python3.9[216700]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:07 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:59:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:07.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:59:07 np0005548918 python3.9[216853]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:07 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:59:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:07.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:59:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:08 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:08 np0005548918 python3.9[217006]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:08 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:08 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54740027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:09 np0005548918 python3.9[217158]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:09 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:09 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:09.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:09 np0005548918 python3.9[217311]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:09 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:09.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:10 np0005548918 python3.9[217489]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:10 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:10 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:10 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:11 np0005548918 python3.9[217641]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:11 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54740027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:59:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:11.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:59:11 np0005548918 python3.9[217794]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:11 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:11.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:12 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:12 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:12 np0005548918 python3.9[217972]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:12 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:13 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:13 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 04:59:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:13 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:13.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:13 np0005548918 python3.9[218126]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:13 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:13.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:14 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:14 np0005548918 python3.9[218279]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:14 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:14 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:14 np0005548918 python3.9[218431]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:14 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:15 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:59:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:15.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:59:15 np0005548918 python3.9[218584]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:15 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:15.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:16 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:16 np0005548918 python3.9[218737]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:16 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:16 np0005548918 python3.9[218889]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:16 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:17 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:17.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:17 np0005548918 python3.9[219042]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:17 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:17.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:18 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:18 np0005548918 python3.9[219195]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:18 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:18 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:19 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:19 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:59:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:19.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:59:19 np0005548918 python3.9[219348]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  6 04:59:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:19 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:19.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:20 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:20 np0005548918 python3.9[219501]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 04:59:20 np0005548918 systemd[1]: Reloading.
Dec  6 04:59:20 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 04:59:20 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 04:59:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:20 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:20 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:21 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:21.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:21 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:21 np0005548918 podman[219661]: 2025-12-06 09:59:21.834583267 +0000 UTC m=+0.116253281 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  6 04:59:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:21.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:21 np0005548918 python3.9[219708]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:22 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f54440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:22 np0005548918 python3.9[219870]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:22 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:22 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:23 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:23 np0005548918 python3.9[220023]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:23.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:23 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:23 np0005548918 python3.9[220177]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:23.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:24 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:24 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:24 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:24 np0005548918 python3.9[220331]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:24 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:25 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:25.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:25 np0005548918 python3.9[220485]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:25 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:25.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:26 np0005548918 python3.9[220639]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:26 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:26 np0005548918 python3.9[220792]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  6 04:59:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:26 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:26 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:27 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:59:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:27.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:59:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:27 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:27.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:28 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:28 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:28 np0005548918 python3.9[220947]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:28 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:29 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:29.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:29 np0005548918 python3.9[221100]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:29 np0005548918 podman[221101]: 2025-12-06 09:59:29.56295878 +0000 UTC m=+0.049783967 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 04:59:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:29 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:59:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:29.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:59:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:30 np0005548918 python3.9[221297]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:30 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003e90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:30 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:30 np0005548918 python3.9[221449]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:30 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:31 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:59:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:31.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:59:31 np0005548918 python3.9[221602]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:31 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:31.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:32 np0005548918 python3.9[221755]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:32 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:32 np0005548918 python3.9[221907]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:32 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:32 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:33 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:33.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:33 np0005548918 python3.9[222060]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:33 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:59:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:33.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:59:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:34 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:34 np0005548918 python3.9[222213]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:34 np0005548918 podman[222214]: 2025-12-06 09:59:34.482021814 +0000 UTC m=+0.082180927 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Dec  6 04:59:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:34 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:35 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:35 np0005548918 python3.9[222386]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:35 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:35.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:35 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:35.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:36 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:36 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:37 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:37 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:59:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:37.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:59:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:37 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:37.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:38 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:38 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:39 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:39 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:39 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:39.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:39 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:39.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:40 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:40 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:41 np0005548918 python3.9[222544]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec  6 04:59:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:41 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:41 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:41.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:41 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:41.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:42 np0005548918 python3.9[222699]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  6 04:59:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:42 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:42 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:43 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:43 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:43 np0005548918 python3.9[222857]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  6 04:59:43 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:59:43 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 04:59:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:43.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:43 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:59:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:43.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:59:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:44 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:44 np0005548918 systemd-logind[800]: New session 54 of user zuul.
Dec  6 04:59:44 np0005548918 systemd[1]: Started Session 54 of User zuul.
Dec  6 04:59:44 np0005548918 systemd[1]: session-54.scope: Deactivated successfully.
Dec  6 04:59:44 np0005548918 systemd-logind[800]: Session 54 logged out. Waiting for processes to exit.
Dec  6 04:59:44 np0005548918 systemd-logind[800]: Removed session 54.
Dec  6 04:59:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:44 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:45 np0005548918 python3.9[223048]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:59:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:45.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:59:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:45 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 04:59:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:45.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 04:59:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:46 np0005548918 python3.9[223170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015184.8171701-3437-234496400676343/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:46 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003f80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:46 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:46 np0005548918 python3.9[223320]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:47 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:47 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:47 np0005548918 python3.9[223397]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:59:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:47.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:59:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:47 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:59:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:47.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:59:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:48 np0005548918 python3.9[223548]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:48 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:48 np0005548918 python3.9[223669]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015187.565002-3437-143912129281962/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:48 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:49 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:49 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:49 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:49 np0005548918 python3.9[223820]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:59:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:49.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:59:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:49 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:49 np0005548918 python3.9[223941]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015188.8603592-3437-114664162115620/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:49.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:50 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:50 np0005548918 python3.9[224117]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:50 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/095950 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 04:59:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:51 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:51 np0005548918 python3.9[224238]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015190.0352015-3437-233612416498987/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:51 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:51.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:51 np0005548918 python3.9[224389]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:51 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:51.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:52 np0005548918 podman[224485]: 2025-12-06 09:59:52.183649212 +0000 UTC m=+0.131382146 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  6 04:59:52 np0005548918 python3.9[224523]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015191.234972-3437-40238444859026/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:52 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:52 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:53 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:53 np0005548918 python3.9[224689]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:53 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:53.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:59:53.665 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 04:59:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:59:53.666 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 04:59:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 09:59:53.666 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 04:59:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:53 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:53 np0005548918 python3.9[224842]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 04:59:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:53.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:54 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:54 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:54 np0005548918 python3.9[224995]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:59:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:54 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:55 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:55 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:55.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:55 np0005548918 python3.9[225148]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:55 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:59:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:55.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:59:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:56 np0005548918 python3.9[225272]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1765015194.957098-3759-106755372697104/.source _original_basename=.d1snky8h follow=False checksum=7d47df32be8963eb7e550d9c60475b85d6544abc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec  6 04:59:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:56 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:56 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:57 np0005548918 python3.9[225425]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 04:59:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:57.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:57 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:57 np0005548918 python3.9[225578]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 04:59:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:57.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 04:59:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:58 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:58 np0005548918 python3.9[225700]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015197.3443687-3835-277094895190287/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=81f1f28d070b2613355f782b83a5777fdba9540e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:58 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:59 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 09:59:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:59 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 04:59:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:59 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 04:59:59 np0005548918 python3.9[225850]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  6 04:59:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 04:59:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 04:59:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:59.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 04:59:59 np0005548918 python3.9[225972]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015198.761653-3880-164823127845701/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=2efe6ae78bce1c26d2c384be079fa366810076ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  6 04:59:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 09:59:59 2025: (VI_0) received an invalid passwd!
Dec  6 04:59:59 np0005548918 podman[225973]: 2025-12-06 09:59:59.854409749 +0000 UTC m=+0.069069095 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  6 04:59:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 09:59:59 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:00:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:00:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:00.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:00:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:00 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:00 np0005548918 ceph-mon[75798]: overall HEALTH_OK
Dec  6 05:00:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:00 np0005548918 python3.9[226144]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec  6 05:00:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:01 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:01 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:01.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:01 np0005548918 python3.9[226297]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  6 05:00:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:02.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:02 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:02 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:00:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:02 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:00:02 np0005548918 python3[226450]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec  6 05:00:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:03 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:03 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:03.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:04.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:04 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:04 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:05 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:05 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:05 np0005548918 podman[226489]: 2025-12-06 10:00:05.187870473 +0000 UTC m=+0.067776740 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  6 05:00:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:05.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:05 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:00:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:06.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:06 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:07 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:07 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:07.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:08.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:08 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:09 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:09 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:09 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:00:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:09.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:00:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:00:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:10.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:00:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:10 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:00:10 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.5 total, 600.0 interval#012Cumulative writes: 3688 writes, 20K keys, 3688 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s#012Cumulative WAL: 3687 writes, 3687 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1429 writes, 6564 keys, 1429 commit groups, 1.0 writes per commit group, ingest: 16.20 MB, 0.03 MB/s#012Interval WAL: 1428 writes, 1428 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     35.8      0.81              0.09         9    0.090       0      0       0.0       0.0#012  L6      1/0   13.34 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.6     92.2     80.2      1.30              0.28         8    0.162     38K   4140       0.0       0.0#012 Sum      1/0   13.34 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.6     56.8     63.1      2.11              0.36        17    0.124     38K   4140       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   6.6     81.9     81.8      0.55              0.13         6    0.092     16K   1857       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0     92.2     80.2      1.30              0.28         8    0.162     38K   4140       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     64.5      0.45              0.09         8    0.056       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.36              0.00         1    0.360       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.5 total, 600.0 interval#012Flush(GB): cumulative 0.028, interval 0.007#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.13 GB write, 0.11 MB/s write, 0.12 GB read, 0.10 MB/s read, 2.1 seconds#012Interval compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.08 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55784c777350#2 capacity: 304.00 MB usage: 5.00 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(268,4.67 MB,1.53559%) FilterBlock(17,116.36 KB,0.037379%) IndexBlock(17,221.86 KB,0.0712696%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  6 05:00:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:10 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:11 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:11 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:11.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:12.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:12 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100013 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:00:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:13 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:13 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:00:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:13.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:00:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:14.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:14 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:14 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:15 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:15 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:15.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:00:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:16.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:00:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:16 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:17 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:17 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:17.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:17 np0005548918 podman[226463]: 2025-12-06 10:00:17.654483058 +0000 UTC m=+14.658300101 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5
Dec  6 05:00:17 np0005548918 podman[226742]: 2025-12-06 10:00:17.81517962 +0000 UTC m=+0.050881456 container create 49753e8c51e19d45c484ca951432b85368bb904bf83e249539a4d053f674edeb (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  6 05:00:17 np0005548918 podman[226742]: 2025-12-06 10:00:17.78762384 +0000 UTC m=+0.023325696 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5
Dec  6 05:00:17 np0005548918 python3[226450]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec  6 05:00:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:18.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:18 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:18 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:00:18 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:00:18 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:00:18 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:00:18 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:00:18 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:00:18 np0005548918 python3.9[226965]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 05:00:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:19 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:19 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:00:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:19.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:00:19 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:19 np0005548918 python3.9[227120]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec  6 05:00:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:20.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:20 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c002130 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:20 np0005548918 python3.9[227273]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  6 05:00:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:21 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:21 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:00:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:21.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:00:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:00:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:22.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:00:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:22 np0005548918 python3[227427]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec  6 05:00:22 np0005548918 podman[227464]: 2025-12-06 10:00:22.389296998 +0000 UTC m=+0.054525735 container create e606f831b872fe1428074d2a7ac6e3bb3b9019488f94059468e11bd866eff6d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:00:22 np0005548918 podman[227464]: 2025-12-06 10:00:22.365059777 +0000 UTC m=+0.030288544 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5
Dec  6 05:00:22 np0005548918 python3[227427]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume 
/etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5 kolla_start
Dec  6 05:00:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:22 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:23 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:23 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:23 np0005548918 podman[227625]: 2025-12-06 10:00:23.213920848 +0000 UTC m=+0.108396206 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  6 05:00:23 np0005548918 python3.9[227671]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 05:00:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:00:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:23.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:00:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:00:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:24.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:00:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:24 np0005548918 python3.9[227833]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 05:00:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:24 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:24 np0005548918 python3.9[228009]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765015224.1818027-4155-20368122483184/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  6 05:00:24 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:24 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:00:24 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:00:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:25 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:25 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:25 np0005548918 python3.9[228085]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  6 05:00:25 np0005548918 systemd[1]: Reloading.
Dec  6 05:00:25 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 05:00:25 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 05:00:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:25.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:00:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:26.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:00:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:26 np0005548918 python3.9[228198]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  6 05:00:26 np0005548918 systemd[1]: Reloading.
Dec  6 05:00:26 np0005548918 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  6 05:00:26 np0005548918 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  6 05:00:26 np0005548918 ceph-osd[78376]: bluestore.MempoolThread fragmentation_score=0.000025 took=0.000037s
Dec  6 05:00:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:26 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:26 np0005548918 systemd[1]: Starting nova_compute container...
Dec  6 05:00:26 np0005548918 systemd[1]: Started libcrun container.
Dec  6 05:00:26 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd86534d7c9d82a41ff055d3f6da2d7f82aa5de9c3960bcfdaf58572bde0bfbf/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:26 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd86534d7c9d82a41ff055d3f6da2d7f82aa5de9c3960bcfdaf58572bde0bfbf/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:26 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd86534d7c9d82a41ff055d3f6da2d7f82aa5de9c3960bcfdaf58572bde0bfbf/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:26 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd86534d7c9d82a41ff055d3f6da2d7f82aa5de9c3960bcfdaf58572bde0bfbf/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:26 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd86534d7c9d82a41ff055d3f6da2d7f82aa5de9c3960bcfdaf58572bde0bfbf/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:26 np0005548918 podman[228238]: 2025-12-06 10:00:26.734601709 +0000 UTC m=+0.099909065 container init e606f831b872fe1428074d2a7ac6e3bb3b9019488f94059468e11bd866eff6d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS)
Dec  6 05:00:26 np0005548918 podman[228238]: 2025-12-06 10:00:26.740376596 +0000 UTC m=+0.105683932 container start e606f831b872fe1428074d2a7ac6e3bb3b9019488f94059468e11bd866eff6d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, org.label-schema.build-date=20251125, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible)
Dec  6 05:00:26 np0005548918 podman[228238]: nova_compute
Dec  6 05:00:26 np0005548918 nova_compute[228253]: + sudo -E kolla_set_configs
Dec  6 05:00:26 np0005548918 systemd[1]: Started nova_compute container.
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Validating config file
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Copying service configuration files
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Deleting /etc/ceph
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Creating directory /etc/ceph
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /etc/ceph
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Writing out command to execute
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:26 np0005548918 nova_compute[228253]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  6 05:00:26 np0005548918 nova_compute[228253]: ++ cat /run_command
Dec  6 05:00:26 np0005548918 nova_compute[228253]: + CMD=nova-compute
Dec  6 05:00:26 np0005548918 nova_compute[228253]: + ARGS=
Dec  6 05:00:26 np0005548918 nova_compute[228253]: + sudo kolla_copy_cacerts
Dec  6 05:00:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:26 np0005548918 nova_compute[228253]: + [[ ! -n '' ]]
Dec  6 05:00:26 np0005548918 nova_compute[228253]: + . kolla_extend_start
Dec  6 05:00:26 np0005548918 nova_compute[228253]: Running command: 'nova-compute'
Dec  6 05:00:26 np0005548918 nova_compute[228253]: + echo 'Running command: '\''nova-compute'\'''
Dec  6 05:00:26 np0005548918 nova_compute[228253]: + umask 0022
Dec  6 05:00:26 np0005548918 nova_compute[228253]: + exec nova-compute
Dec  6 05:00:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:27 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:27 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:27.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:27 np0005548918 python3.9[228416]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 05:00:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:00:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:28.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:00:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:28 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:28 np0005548918 python3.9[228568]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 05:00:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:29 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:29 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:29.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:29 np0005548918 nova_compute[228253]: 2025-12-06 10:00:29.536 228257 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  6 05:00:29 np0005548918 nova_compute[228253]: 2025-12-06 10:00:29.537 228257 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  6 05:00:29 np0005548918 nova_compute[228253]: 2025-12-06 10:00:29.537 228257 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  6 05:00:29 np0005548918 nova_compute[228253]: 2025-12-06 10:00:29.537 228257 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.651800) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229651862, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1268, "num_deletes": 251, "total_data_size": 3255315, "memory_usage": 3290288, "flush_reason": "Manual Compaction"}
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229666769, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 2113460, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19574, "largest_seqno": 20837, "table_properties": {"data_size": 2107863, "index_size": 2989, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11905, "raw_average_key_size": 19, "raw_value_size": 2096729, "raw_average_value_size": 3506, "num_data_blocks": 132, "num_entries": 598, "num_filter_entries": 598, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015121, "oldest_key_time": 1765015121, "file_creation_time": 1765015229, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 15020 microseconds, and 5059 cpu microseconds.
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.666823) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 2113460 bytes OK
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.666845) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.670747) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.670819) EVENT_LOG_v1 {"time_micros": 1765015229670806, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.670856) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 3249272, prev total WAL file size 3249272, number of live WAL files 2.
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.672083) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(2063KB)], [36(13MB)]
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229672137, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 16100666, "oldest_snapshot_seqno": -1}
Dec  6 05:00:29 np0005548918 nova_compute[228253]: 2025-12-06 10:00:29.733 228257 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:00:29 np0005548918 nova_compute[228253]: 2025-12-06 10:00:29.757 228257 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:00:29 np0005548918 nova_compute[228253]: 2025-12-06 10:00:29.758 228257 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 5035 keys, 13954516 bytes, temperature: kUnknown
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229794081, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 13954516, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13919263, "index_size": 21575, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12613, "raw_key_size": 128319, "raw_average_key_size": 25, "raw_value_size": 13826225, "raw_average_value_size": 2746, "num_data_blocks": 885, "num_entries": 5035, "num_filter_entries": 5035, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765015229, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.794336) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 13954516 bytes
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.818583) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 132.0 rd, 114.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 13.3 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(14.2) write-amplify(6.6) OK, records in: 5555, records dropped: 520 output_compression: NoCompression
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.818623) EVENT_LOG_v1 {"time_micros": 1765015229818608, "job": 20, "event": "compaction_finished", "compaction_time_micros": 122013, "compaction_time_cpu_micros": 28302, "output_level": 6, "num_output_files": 1, "total_output_size": 13954516, "num_input_records": 5555, "num_output_records": 5035, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229819113, "job": 20, "event": "table_file_deletion", "file_number": 38}
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229821450, "job": 20, "event": "table_file_deletion", "file_number": 36}
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.671990) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.821490) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.821495) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.821497) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.821498) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:00:29.821500) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:00:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:29 np0005548918 python3.9[228721]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  6 05:00:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:00:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:30.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:00:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:30 np0005548918 podman[228773]: 2025-12-06 10:00:30.083990137 +0000 UTC m=+0.046037672 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent)
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.222 228257 INFO nova.virt.driver [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.344 228257 INFO nova.compute.provider_config [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  6 05:00:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:30 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.421 228257 DEBUG oslo_concurrency.lockutils [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.422 228257 DEBUG oslo_concurrency.lockutils [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.422 228257 DEBUG oslo_concurrency.lockutils [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.423 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.423 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.423 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.423 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.423 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.424 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.424 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.424 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.424 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.424 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.425 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.425 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.425 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.425 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.426 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.426 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.426 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.426 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.426 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.427 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.427 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.427 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.427 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.427 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.428 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.428 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.428 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.428 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.429 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.429 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.429 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.429 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.429 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.430 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.430 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.430 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.430 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.431 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.431 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.431 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.431 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.431 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.432 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.432 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.432 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.432 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.432 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.433 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.433 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.433 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.433 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.433 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.434 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.434 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.434 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.434 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.434 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.435 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.435 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.435 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.435 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.435 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.436 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.436 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.436 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.436 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.436 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.436 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.437 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.437 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.437 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.437 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.437 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.438 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.438 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.438 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.438 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.438 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.439 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.439 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.439 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.439 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.439 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.439 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.440 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.440 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.440 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.440 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.440 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.441 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.441 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.441 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.441 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.441 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.441 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.442 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.442 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.442 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.442 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.442 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.442 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.443 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.443 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.443 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.443 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.443 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.443 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.443 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.444 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.444 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.444 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.444 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.444 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.444 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.445 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.445 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.445 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.445 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.445 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.445 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.445 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.446 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.446 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.446 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.446 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.446 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.446 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.446 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.447 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.447 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.447 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.447 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.447 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.447 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.447 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.448 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.448 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.448 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.448 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.448 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.448 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.449 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.449 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.449 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.449 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.449 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.449 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.449 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.450 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.450 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.450 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.450 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.450 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.450 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.450 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.451 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.451 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.451 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.451 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.451 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.451 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.452 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.452 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.452 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.452 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.452 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.452 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.452 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.453 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.453 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.453 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.453 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.453 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.453 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.454 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.454 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.454 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.454 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.454 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.455 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.455 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.455 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.455 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.455 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.455 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.456 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.456 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.456 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.456 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.456 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.456 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.457 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.457 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.457 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.457 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.457 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.457 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.457 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.458 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.458 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.458 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.458 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.458 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.458 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.459 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.459 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.459 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.459 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.459 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.459 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.460 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.460 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.460 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.460 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.460 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.461 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.461 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.461 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.461 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.461 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.462 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.462 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.462 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.462 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.462 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.463 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.463 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.463 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.463 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.463 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.464 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.464 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.464 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.464 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.464 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.464 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.465 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.465 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.465 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.465 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.465 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.465 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.466 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.466 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.466 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.466 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.466 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.466 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.466 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.466 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.467 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.467 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.467 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.467 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.467 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.467 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.468 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.468 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.468 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.468 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.468 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.468 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.468 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.469 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.469 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.469 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.469 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.469 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.470 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.470 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.470 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.470 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.470 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.470 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.470 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.471 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.471 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.471 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.471 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.471 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.471 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.472 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.472 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.472 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.472 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.472 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.472 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.472 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.473 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.473 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.473 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.473 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.473 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.473 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.473 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.474 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.474 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.474 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.474 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.474 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.474 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.474 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.475 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.475 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.475 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.475 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.475 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.475 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.475 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.475 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.476 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.476 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.476 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.476 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.476 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.476 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.477 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.477 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.477 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.477 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.477 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.477 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.477 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.478 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.478 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.478 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.478 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.478 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.478 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.478 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.479 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.479 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.479 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.479 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.479 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.479 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.479 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.480 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.480 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.480 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.480 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.480 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.480 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.480 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.481 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.481 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.481 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.481 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.481 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.481 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.481 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.482 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.482 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.482 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.482 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.482 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.482 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.483 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.483 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.483 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.483 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.483 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.483 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.483 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.484 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.484 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.484 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.484 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.484 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.484 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.484 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.485 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.485 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.485 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.485 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.485 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.485 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.485 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.486 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.486 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.486 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.486 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.486 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.486 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.486 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.487 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.487 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.487 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.487 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.487 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.487 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.488 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.488 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.488 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.488 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.488 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.488 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.489 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.489 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.489 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.489 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.489 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.489 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.490 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.490 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.490 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.490 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.490 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.490 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.491 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.491 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.491 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.491 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.491 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.491 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.492 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.492 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.492 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.492 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.492 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.492 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.493 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.493 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.493 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.493 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.493 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.493 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.494 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.494 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.494 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.494 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.494 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.495 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.495 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.495 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.495 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.495 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.495 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.496 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.496 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.496 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.496 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.496 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.497 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.497 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.497 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.497 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.497 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.498 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.498 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.498 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.498 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.498 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.499 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.499 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.499 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.499 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.499 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.499 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.500 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.500 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.500 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.500 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.500 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.500 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.500 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.501 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.501 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.501 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.501 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.502 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.502 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.502 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.502 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.502 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.503 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.503 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.503 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.503 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.503 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.503 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.504 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.504 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.504 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.504 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.504 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.505 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.505 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.505 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.505 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.505 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.506 228257 WARNING oslo_config.cfg [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  6 05:00:30 np0005548918 nova_compute[228253]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  6 05:00:30 np0005548918 nova_compute[228253]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  6 05:00:30 np0005548918 nova_compute[228253]: and ``live_migration_inbound_addr`` respectively.
Dec  6 05:00:30 np0005548918 nova_compute[228253]: ).  Its value may be silently ignored in the future.#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.506 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.506 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.506 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.507 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.507 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.507 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.507 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.507 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.508 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.508 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.508 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.508 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.508 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.509 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.509 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.509 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.509 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.509 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.509 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.rbd_secret_uuid        = 5ecd3f74-dade-5fc4-92ce-8950ae424258 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.510 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.510 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.510 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.510 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.511 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.511 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.511 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.511 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.511 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.511 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.512 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.512 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.512 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.512 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.513 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.513 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.513 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.513 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.513 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.513 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.514 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.514 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.514 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.514 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.514 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.514 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.515 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.515 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.515 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.515 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.515 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.516 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.516 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.516 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.516 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.516 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.517 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.517 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.517 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.517 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.517 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.517 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.517 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.517 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.518 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.518 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.518 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.518 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.518 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.518 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.518 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.519 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.519 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.519 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.519 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.519 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.519 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.519 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.520 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.520 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.520 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.520 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.520 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.520 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.520 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.521 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.521 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.521 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.521 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.521 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.521 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.521 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.522 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.522 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.522 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.522 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.522 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.522 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.523 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.523 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.523 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.523 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.523 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.523 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.523 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.524 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.524 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.524 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.524 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.524 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.524 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.524 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.525 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.525 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.525 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.525 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.525 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.525 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.525 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.525 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.526 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.526 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.526 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.526 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.526 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.526 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.526 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.527 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.527 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.527 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.527 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.527 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.527 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.527 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.528 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.528 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.528 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.528 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.528 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.528 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.529 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.529 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.529 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.529 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.529 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.529 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.530 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.530 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.530 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.530 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.530 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.530 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.530 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.530 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.531 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.531 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.531 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.531 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.531 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.531 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.532 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.532 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.532 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.532 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.532 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.532 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.533 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.533 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.533 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.533 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.533 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.533 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.533 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.534 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.534 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.534 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.534 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.534 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.534 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.535 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.535 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.535 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.535 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.535 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.535 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.536 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.536 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.536 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.536 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.536 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.536 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.536 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.537 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.537 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.537 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.537 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.537 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.537 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.538 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.538 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.538 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.538 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.538 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.538 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.538 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.539 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.539 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.539 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.539 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.539 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.539 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.539 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.540 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.540 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.540 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.540 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.540 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.540 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.541 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.541 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.541 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.541 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.541 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.541 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.541 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.542 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.542 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.542 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.542 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.542 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.542 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.542 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.543 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.543 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.543 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.543 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.543 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.543 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.543 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.544 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.544 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.544 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.544 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.544 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.544 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.545 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.545 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.545 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.545 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.545 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.546 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.546 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.546 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.546 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.546 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.546 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.547 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.547 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.547 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.547 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.547 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.547 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.547 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.548 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.548 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.548 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.548 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.548 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.548 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.549 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.549 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.549 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.549 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.549 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.549 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.549 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.550 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.550 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.550 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.550 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.550 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.550 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.550 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.551 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.551 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.551 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.551 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.551 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.551 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.551 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.552 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.552 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.552 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.552 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.552 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.553 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.553 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.553 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.553 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.553 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.553 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.553 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.554 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.554 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.554 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.554 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.554 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.554 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.555 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.555 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.555 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.555 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.555 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.555 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.555 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.556 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.556 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.556 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.556 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.556 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.556 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.557 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.557 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.557 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.557 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.557 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.557 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.558 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.558 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.558 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.558 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.558 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.558 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.559 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.559 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.559 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.559 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.559 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.559 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.560 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.560 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.560 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.560 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.561 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.561 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.561 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.561 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.561 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.561 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.562 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.562 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.562 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.562 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.562 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.562 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.563 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.563 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.563 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.563 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.563 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.563 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.563 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.564 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.564 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.564 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.564 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.564 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.565 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.565 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.565 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.566 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.566 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.566 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.566 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.566 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.566 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.567 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.567 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.567 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.567 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.567 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.567 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.568 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.568 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.568 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.568 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.568 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.568 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.568 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.569 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.569 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.569 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.569 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.569 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.569 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.570 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.570 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.570 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.570 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.570 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.570 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.570 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.571 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.571 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.571 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.571 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.571 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.572 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.572 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.572 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.572 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.572 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.573 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.573 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.573 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.573 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.573 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.573 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.574 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.574 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.574 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.574 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.574 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.574 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.574 228257 DEBUG oslo_service.service [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.576 228257 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.592 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.593 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.593 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.593 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  6 05:00:30 np0005548918 systemd[1]: Starting libvirt QEMU daemon...
Dec  6 05:00:30 np0005548918 systemd[1]: Started libvirt QEMU daemon.
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.689 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fd108bd9550> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.692 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fd108bd9550> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.694 228257 INFO nova.virt.libvirt.driver [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.714 228257 WARNING nova.virt.libvirt.driver [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Dec  6 05:00:30 np0005548918 nova_compute[228253]: 2025-12-06 10:00:30.714 228257 DEBUG nova.virt.libvirt.volume.mount [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec  6 05:00:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:31 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:31 np0005548918 python3.9[228972]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  6 05:00:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:31 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 05:00:31 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 05:00:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:31 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:31.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.525 228257 INFO nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Libvirt host capabilities <capabilities>
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <host>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <uuid>9abb64ab-b4a8-4a69-9492-3018ef71c6f2</uuid>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <cpu>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <arch>x86_64</arch>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model>EPYC-Rome-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <vendor>AMD</vendor>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <microcode version='16777317'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <signature family='23' model='49' stepping='0'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='x2apic'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='tsc-deadline'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='osxsave'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='hypervisor'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='tsc_adjust'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='spec-ctrl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='stibp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='arch-capabilities'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='cmp_legacy'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='topoext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='virt-ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='lbrv'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='tsc-scale'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='vmcb-clean'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='pause-filter'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='pfthreshold'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='svme-addr-chk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='rdctl-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='skip-l1dfl-vmentry'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='mds-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature name='pschange-mc-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <pages unit='KiB' size='4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <pages unit='KiB' size='2048'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <pages unit='KiB' size='1048576'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </cpu>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <power_management>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <suspend_mem/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </power_management>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <iommu support='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <migration_features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <live/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <uri_transports>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <uri_transport>tcp</uri_transport>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <uri_transport>rdma</uri_transport>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </uri_transports>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </migration_features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <topology>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <cells num='1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <cell id='0'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:          <memory unit='KiB'>7864316</memory>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:          <pages unit='KiB' size='4'>1966079</pages>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:          <pages unit='KiB' size='2048'>0</pages>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:          <distances>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:            <sibling id='0' value='10'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:          </distances>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:          <cpus num='8'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:          </cpus>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        </cell>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </cells>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </topology>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <cache>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </cache>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <secmodel>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model>selinux</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <doi>0</doi>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </secmodel>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <secmodel>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model>dac</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <doi>0</doi>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </secmodel>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </host>
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <guest>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <os_type>hvm</os_type>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <arch name='i686'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <wordsize>32</wordsize>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <domain type='qemu'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <domain type='kvm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </arch>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <pae/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <nonpae/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <acpi default='on' toggle='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <apic default='on' toggle='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <cpuselection/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <deviceboot/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <disksnapshot default='on' toggle='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <externalSnapshot/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </guest>
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <guest>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <os_type>hvm</os_type>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <arch name='x86_64'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <wordsize>64</wordsize>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <domain type='qemu'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <domain type='kvm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </arch>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <acpi default='on' toggle='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <apic default='on' toggle='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <cpuselection/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <deviceboot/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <disksnapshot default='on' toggle='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <externalSnapshot/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </guest>
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 
Dec  6 05:00:31 np0005548918 nova_compute[228253]: </capabilities>
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.530 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.550 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  6 05:00:31 np0005548918 nova_compute[228253]: <domainCapabilities>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <domain>kvm</domain>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <arch>i686</arch>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <vcpu max='240'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <iothreads supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <os supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <enum name='firmware'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <loader supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>rom</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pflash</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='readonly'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>yes</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>no</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='secure'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>no</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </loader>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </os>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <cpu>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>on</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>off</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='maximumMigratable'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>on</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>off</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <vendor>AMD</vendor>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='succor'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='custom' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cooperlake'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='GraniteRapids'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10-128'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10-256'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10-512'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v6'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v7'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='KnightsMill'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='KnightsMill-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G4-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G5-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SierraForest'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SierraForest-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='athlon'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='athlon-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='core2duo'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='core2duo-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='coreduo'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='coreduo-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='n270'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='n270-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='phenom'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='phenom-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </cpu>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <memoryBacking supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <enum name='sourceType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>file</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>anonymous</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>memfd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </memoryBacking>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <devices>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <disk supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='diskDevice'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>disk</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>cdrom</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>floppy</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>lun</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='bus'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>ide</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>fdc</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>scsi</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>usb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>sata</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </disk>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <graphics supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vnc</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>egl-headless</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>dbus</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </graphics>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <video supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='modelType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vga</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>cirrus</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>none</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>bochs</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>ramfb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </video>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <hostdev supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='mode'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>subsystem</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='startupPolicy'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>default</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>mandatory</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>requisite</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>optional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='subsysType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>usb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pci</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>scsi</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='capsType'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='pciBackend'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </hostdev>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <rng supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>random</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>egd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>builtin</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </rng>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <filesystem supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='driverType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>path</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>handle</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtiofs</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </filesystem>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <tpm supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tpm-tis</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tpm-crb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>emulator</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>external</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendVersion'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>2.0</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </tpm>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <redirdev supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='bus'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>usb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </redirdev>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <channel supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pty</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>unix</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </channel>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <crypto supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>qemu</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>builtin</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </crypto>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <interface supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>default</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>passt</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </interface>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <panic supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>isa</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>hyperv</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </panic>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <console supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>null</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vc</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pty</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>dev</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>file</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pipe</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>stdio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>udp</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tcp</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>unix</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>qemu-vdagent</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>dbus</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </console>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </devices>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <gic supported='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <vmcoreinfo supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <genid supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <backingStoreInput supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <backup supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <async-teardown supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <ps2 supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <sev supported='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <sgx supported='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <hyperv supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='features'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>relaxed</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vapic</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>spinlocks</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vpindex</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>runtime</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>synic</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>stimer</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>reset</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vendor_id</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>frequencies</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>reenlightenment</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tlbflush</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>ipi</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>avic</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>emsr_bitmap</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>xmm_input</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <defaults>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <spinlocks>4095</spinlocks>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <stimer_direct>on</stimer_direct>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <tlbflush_direct>on</tlbflush_direct>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <tlbflush_extended>on</tlbflush_extended>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </defaults>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </hyperv>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <launchSecurity supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='sectype'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tdx</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </launchSecurity>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]: </domainCapabilities>
Dec  6 05:00:31 np0005548918 nova_compute[228253]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.556 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  6 05:00:31 np0005548918 nova_compute[228253]: <domainCapabilities>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <domain>kvm</domain>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <arch>i686</arch>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <vcpu max='4096'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <iothreads supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <os supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <enum name='firmware'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <loader supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>rom</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pflash</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='readonly'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>yes</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>no</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='secure'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>no</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </loader>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </os>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <cpu>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>on</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>off</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='maximumMigratable'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>on</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>off</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <vendor>AMD</vendor>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='succor'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='custom' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cooperlake'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='GraniteRapids'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10-128'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10-256'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10-512'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v6'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v7'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='KnightsMill'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='KnightsMill-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G4-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G5-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SierraForest'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SierraForest-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='athlon'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='athlon-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='core2duo'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='core2duo-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='coreduo'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='coreduo-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='n270'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='n270-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='phenom'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='phenom-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </cpu>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <memoryBacking supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <enum name='sourceType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>file</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>anonymous</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>memfd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </memoryBacking>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <devices>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <disk supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='diskDevice'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>disk</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>cdrom</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>floppy</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>lun</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='bus'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>fdc</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>scsi</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>usb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>sata</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </disk>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <graphics supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vnc</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>egl-headless</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>dbus</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </graphics>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <video supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='modelType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vga</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>cirrus</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>none</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>bochs</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>ramfb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </video>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <hostdev supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='mode'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>subsystem</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='startupPolicy'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>default</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>mandatory</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>requisite</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>optional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='subsysType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>usb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pci</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>scsi</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='capsType'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='pciBackend'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </hostdev>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <rng supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>random</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>egd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>builtin</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </rng>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <filesystem supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='driverType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>path</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>handle</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtiofs</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </filesystem>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <tpm supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tpm-tis</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tpm-crb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>emulator</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>external</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendVersion'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>2.0</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </tpm>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <redirdev supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='bus'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>usb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </redirdev>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <channel supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pty</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>unix</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </channel>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <crypto supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>qemu</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>builtin</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </crypto>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <interface supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>default</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>passt</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </interface>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <panic supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>isa</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>hyperv</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </panic>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <console supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>null</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vc</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pty</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>dev</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>file</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pipe</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>stdio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>udp</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tcp</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>unix</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>qemu-vdagent</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>dbus</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </console>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </devices>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <gic supported='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <vmcoreinfo supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <genid supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <backingStoreInput supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <backup supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <async-teardown supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <ps2 supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <sev supported='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <sgx supported='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <hyperv supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='features'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>relaxed</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vapic</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>spinlocks</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vpindex</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>runtime</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>synic</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>stimer</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>reset</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vendor_id</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>frequencies</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>reenlightenment</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tlbflush</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>ipi</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>avic</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>emsr_bitmap</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>xmm_input</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <defaults>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <spinlocks>4095</spinlocks>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <stimer_direct>on</stimer_direct>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <tlbflush_direct>on</tlbflush_direct>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <tlbflush_extended>on</tlbflush_extended>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </defaults>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </hyperv>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <launchSecurity supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='sectype'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tdx</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </launchSecurity>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]: </domainCapabilities>
Dec  6 05:00:31 np0005548918 nova_compute[228253]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.584 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.588 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  6 05:00:31 np0005548918 nova_compute[228253]: <domainCapabilities>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <domain>kvm</domain>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <arch>x86_64</arch>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <vcpu max='240'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <iothreads supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <os supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <enum name='firmware'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <loader supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>rom</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pflash</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='readonly'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>yes</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>no</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='secure'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>no</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </loader>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </os>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <cpu>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>on</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>off</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='maximumMigratable'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>on</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>off</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <vendor>AMD</vendor>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='succor'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='custom' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cooperlake'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='GraniteRapids'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10-128'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10-256'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10-512'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v6'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v7'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='KnightsMill'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='KnightsMill-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G4-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G5-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SierraForest'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SierraForest-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='athlon'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='athlon-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='core2duo'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='core2duo-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='coreduo'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='coreduo-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='n270'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='n270-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='phenom'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='phenom-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </cpu>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <memoryBacking supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <enum name='sourceType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>file</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>anonymous</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>memfd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </memoryBacking>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <devices>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <disk supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='diskDevice'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>disk</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>cdrom</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>floppy</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>lun</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='bus'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>ide</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>fdc</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>scsi</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>usb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>sata</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </disk>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <graphics supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vnc</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>egl-headless</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>dbus</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </graphics>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <video supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='modelType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vga</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>cirrus</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>none</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>bochs</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>ramfb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </video>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <hostdev supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='mode'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>subsystem</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='startupPolicy'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>default</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>mandatory</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>requisite</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>optional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='subsysType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>usb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pci</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>scsi</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='capsType'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='pciBackend'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </hostdev>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <rng supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>random</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>egd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>builtin</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </rng>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <filesystem supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='driverType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>path</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>handle</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtiofs</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </filesystem>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <tpm supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tpm-tis</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tpm-crb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>emulator</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>external</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendVersion'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>2.0</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </tpm>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <redirdev supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='bus'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>usb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </redirdev>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <channel supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pty</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>unix</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </channel>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <crypto supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>qemu</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>builtin</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </crypto>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <interface supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>default</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>passt</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </interface>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <panic supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>isa</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>hyperv</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </panic>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <console supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>null</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vc</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pty</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>dev</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>file</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pipe</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>stdio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>udp</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tcp</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>unix</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>qemu-vdagent</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>dbus</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </console>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </devices>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <gic supported='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <vmcoreinfo supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <genid supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <backingStoreInput supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <backup supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <async-teardown supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <ps2 supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <sev supported='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <sgx supported='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <hyperv supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='features'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>relaxed</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vapic</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>spinlocks</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vpindex</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>runtime</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>synic</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>stimer</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>reset</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vendor_id</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>frequencies</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>reenlightenment</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tlbflush</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>ipi</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>avic</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>emsr_bitmap</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>xmm_input</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <defaults>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <spinlocks>4095</spinlocks>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <stimer_direct>on</stimer_direct>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <tlbflush_direct>on</tlbflush_direct>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <tlbflush_extended>on</tlbflush_extended>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </defaults>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </hyperv>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <launchSecurity supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='sectype'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tdx</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </launchSecurity>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]: </domainCapabilities>
Dec  6 05:00:31 np0005548918 nova_compute[228253]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.647 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  6 05:00:31 np0005548918 nova_compute[228253]: <domainCapabilities>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <domain>kvm</domain>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <arch>x86_64</arch>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <vcpu max='4096'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <iothreads supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <os supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <enum name='firmware'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>efi</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <loader supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>rom</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pflash</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='readonly'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>yes</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>no</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='secure'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>yes</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>no</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </loader>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </os>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <cpu>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>on</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>off</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='maximumMigratable'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>on</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>off</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <vendor>AMD</vendor>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='succor'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <mode name='custom' supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cooperlake'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Denverton-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='auto-ibrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amd-psfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='stibp-always-on'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='EPYC-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='GraniteRapids'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10-128'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10-256'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx10-512'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='prefetchiti'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Haswell-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v6'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Icelake-Server-v7'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='IvyBridge-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='KnightsMill'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='KnightsMill-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512er'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512pf'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G4-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Opteron_G5-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fma4'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tbm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xop'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SapphireRapids-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='amx-tile'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-bf16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-fp16'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bitalg'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrc'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fzrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='la57'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='taa-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xfd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SierraForest'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='SierraForest-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ifma'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cmpccxadd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fbsdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='fsrs'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ibrs-all'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mcdt-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pbrsb-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='psdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='serialize'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vaes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Client-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='hle'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='rtm'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Skylake-Server-v5'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512bw'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512cd'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512dq'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512f'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='avx512vl'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='invpcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pcid'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='pku'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='mpx'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v2'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v3'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='core-capability'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='split-lock-detect'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='Snowridge-v4'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='cldemote'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='erms'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='gfni'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdir64b'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='movdiri'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='xsaves'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='athlon'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='athlon-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='core2duo'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='core2duo-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='coreduo'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='coreduo-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='n270'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='n270-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='ss'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='phenom'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <blockers model='phenom-v1'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnow'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <feature name='3dnowext'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </blockers>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </mode>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </cpu>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <memoryBacking supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <enum name='sourceType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>file</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>anonymous</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <value>memfd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </memoryBacking>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <devices>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <disk supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='diskDevice'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>disk</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>cdrom</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>floppy</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>lun</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='bus'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>fdc</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>scsi</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>usb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>sata</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </disk>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <graphics supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vnc</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>egl-headless</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>dbus</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </graphics>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <video supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='modelType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vga</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>cirrus</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>none</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>bochs</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>ramfb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </video>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <hostdev supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='mode'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>subsystem</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='startupPolicy'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>default</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>mandatory</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>requisite</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>optional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='subsysType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>usb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pci</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>scsi</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='capsType'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='pciBackend'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </hostdev>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <rng supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtio-non-transitional</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>random</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>egd</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>builtin</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </rng>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <filesystem supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='driverType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>path</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>handle</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>virtiofs</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </filesystem>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <tpm supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tpm-tis</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tpm-crb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>emulator</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>external</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendVersion'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>2.0</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </tpm>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <redirdev supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='bus'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>usb</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </redirdev>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <channel supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pty</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>unix</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </channel>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <crypto supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>qemu</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendModel'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>builtin</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </crypto>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <interface supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='backendType'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>default</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>passt</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </interface>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <panic supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='model'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>isa</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>hyperv</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </panic>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <console supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='type'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>null</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vc</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pty</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>dev</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>file</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>pipe</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>stdio</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>udp</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tcp</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>unix</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>qemu-vdagent</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>dbus</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </console>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </devices>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  <features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <gic supported='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <vmcoreinfo supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <genid supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <backingStoreInput supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <backup supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <async-teardown supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <ps2 supported='yes'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <sev supported='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <sgx supported='no'/>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <hyperv supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='features'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>relaxed</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vapic</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>spinlocks</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vpindex</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>runtime</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>synic</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>stimer</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>reset</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>vendor_id</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>frequencies</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>reenlightenment</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tlbflush</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>ipi</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>avic</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>emsr_bitmap</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>xmm_input</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <defaults>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <spinlocks>4095</spinlocks>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <stimer_direct>on</stimer_direct>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <tlbflush_direct>on</tlbflush_direct>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <tlbflush_extended>on</tlbflush_extended>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </defaults>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </hyperv>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    <launchSecurity supported='yes'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      <enum name='sectype'>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:        <value>tdx</value>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:      </enum>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:    </launchSecurity>
Dec  6 05:00:31 np0005548918 nova_compute[228253]:  </features>
Dec  6 05:00:31 np0005548918 nova_compute[228253]: </domainCapabilities>
Dec  6 05:00:31 np0005548918 nova_compute[228253]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.710 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.710 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.710 228257 DEBUG nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.711 228257 INFO nova.virt.libvirt.host [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Secure Boot support detected#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.712 228257 INFO nova.virt.libvirt.driver [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.712 228257 INFO nova.virt.libvirt.driver [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.721 228257 DEBUG nova.virt.libvirt.driver [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.754 228257 INFO nova.virt.node [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Determined node identity 31f5f484-bf36-44de-83b8-7b434061a77b from /var/lib/nova/compute_id#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.795 228257 WARNING nova.compute.manager [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Compute nodes ['31f5f484-bf36-44de-83b8-7b434061a77b'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.821 228257 INFO nova.compute.manager [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec  6 05:00:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.856 228257 WARNING nova.compute.manager [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.856 228257 DEBUG oslo_concurrency.lockutils [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.857 228257 DEBUG oslo_concurrency.lockutils [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.857 228257 DEBUG oslo_concurrency.lockutils [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.857 228257 DEBUG nova.compute.resource_tracker [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:00:31 np0005548918 nova_compute[228253]: 2025-12-06 10:00:31.857 228257 DEBUG oslo_concurrency.processutils [None req-493c96ed-a17d-43bd-bfbe-9c7043fc9d0c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:00:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:32.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:32 np0005548918 python3.9[229161]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  6 05:00:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:32 np0005548918 systemd[1]: Stopping nova_compute container...
Dec  6 05:00:32 np0005548918 nova_compute[228253]: 2025-12-06 10:00:32.149 228257 DEBUG oslo_concurrency.lockutils [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:00:32 np0005548918 nova_compute[228253]: 2025-12-06 10:00:32.150 228257 DEBUG oslo_concurrency.lockutils [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:00:32 np0005548918 nova_compute[228253]: 2025-12-06 10:00:32.150 228257 DEBUG oslo_concurrency.lockutils [None req-7695cc54-3630-448d-b614-db434d289741 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:00:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:32 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:32 np0005548918 virtqemud[228866]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec  6 05:00:32 np0005548918 virtqemud[228866]: hostname: compute-2
Dec  6 05:00:32 np0005548918 virtqemud[228866]: End of file while reading data: Input/output error
Dec  6 05:00:32 np0005548918 systemd[1]: libpod-e606f831b872fe1428074d2a7ac6e3bb3b9019488f94059468e11bd866eff6d8.scope: Deactivated successfully.
Dec  6 05:00:32 np0005548918 systemd[1]: libpod-e606f831b872fe1428074d2a7ac6e3bb3b9019488f94059468e11bd866eff6d8.scope: Consumed 3.540s CPU time.
Dec  6 05:00:32 np0005548918 podman[229186]: 2025-12-06 10:00:32.610137928 +0000 UTC m=+0.504305102 container died e606f831b872fe1428074d2a7ac6e3bb3b9019488f94059468e11bd866eff6d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  6 05:00:32 np0005548918 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e606f831b872fe1428074d2a7ac6e3bb3b9019488f94059468e11bd866eff6d8-userdata-shm.mount: Deactivated successfully.
Dec  6 05:00:32 np0005548918 systemd[1]: var-lib-containers-storage-overlay-cd86534d7c9d82a41ff055d3f6da2d7f82aa5de9c3960bcfdaf58572bde0bfbf-merged.mount: Deactivated successfully.
Dec  6 05:00:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:33 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:33 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:33.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:34.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:34 np0005548918 podman[229186]: 2025-12-06 10:00:34.057920782 +0000 UTC m=+1.952087956 container cleanup e606f831b872fe1428074d2a7ac6e3bb3b9019488f94059468e11bd866eff6d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  6 05:00:34 np0005548918 podman[229186]: nova_compute
Dec  6 05:00:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:34 np0005548918 podman[229218]: nova_compute
Dec  6 05:00:34 np0005548918 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec  6 05:00:34 np0005548918 systemd[1]: Stopped nova_compute container.
Dec  6 05:00:34 np0005548918 systemd[1]: Starting nova_compute container...
Dec  6 05:00:34 np0005548918 systemd[1]: Started libcrun container.
Dec  6 05:00:34 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd86534d7c9d82a41ff055d3f6da2d7f82aa5de9c3960bcfdaf58572bde0bfbf/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:34 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd86534d7c9d82a41ff055d3f6da2d7f82aa5de9c3960bcfdaf58572bde0bfbf/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:34 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd86534d7c9d82a41ff055d3f6da2d7f82aa5de9c3960bcfdaf58572bde0bfbf/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:34 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd86534d7c9d82a41ff055d3f6da2d7f82aa5de9c3960bcfdaf58572bde0bfbf/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:34 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd86534d7c9d82a41ff055d3f6da2d7f82aa5de9c3960bcfdaf58572bde0bfbf/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:34 np0005548918 podman[229231]: 2025-12-06 10:00:34.233682428 +0000 UTC m=+0.084866877 container init e606f831b872fe1428074d2a7ac6e3bb3b9019488f94059468e11bd866eff6d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm)
Dec  6 05:00:34 np0005548918 podman[229231]: 2025-12-06 10:00:34.238834308 +0000 UTC m=+0.090018737 container start e606f831b872fe1428074d2a7ac6e3bb3b9019488f94059468e11bd866eff6d8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute)
Dec  6 05:00:34 np0005548918 podman[229231]: nova_compute
Dec  6 05:00:34 np0005548918 nova_compute[229246]: + sudo -E kolla_set_configs
Dec  6 05:00:34 np0005548918 systemd[1]: Started nova_compute container.
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Validating config file
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Copying service configuration files
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Deleting /etc/ceph
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Creating directory /etc/ceph
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /etc/ceph
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Writing out command to execute
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:34 np0005548918 nova_compute[229246]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  6 05:00:34 np0005548918 nova_compute[229246]: ++ cat /run_command
Dec  6 05:00:34 np0005548918 nova_compute[229246]: + CMD=nova-compute
Dec  6 05:00:34 np0005548918 nova_compute[229246]: + ARGS=
Dec  6 05:00:34 np0005548918 nova_compute[229246]: + sudo kolla_copy_cacerts
Dec  6 05:00:34 np0005548918 nova_compute[229246]: + [[ ! -n '' ]]
Dec  6 05:00:34 np0005548918 nova_compute[229246]: + . kolla_extend_start
Dec  6 05:00:34 np0005548918 nova_compute[229246]: + echo 'Running command: '\''nova-compute'\'''
Dec  6 05:00:34 np0005548918 nova_compute[229246]: Running command: 'nova-compute'
Dec  6 05:00:34 np0005548918 nova_compute[229246]: + umask 0022
Dec  6 05:00:34 np0005548918 nova_compute[229246]: + exec nova-compute
Dec  6 05:00:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:34 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:00:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:35 np0005548918 python3.9[229409]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  6 05:00:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:35 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:35 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:35 np0005548918 systemd[1]: Started libpod-conmon-49753e8c51e19d45c484ca951432b85368bb904bf83e249539a4d053f674edeb.scope.
Dec  6 05:00:35 np0005548918 systemd[1]: Started libcrun container.
Dec  6 05:00:35 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67b9324141bcc6258b9b7a8deeceec191e60d59b6e8a6ca474b5fc0f4f05ad07/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:35 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67b9324141bcc6258b9b7a8deeceec191e60d59b6e8a6ca474b5fc0f4f05ad07/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:35 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67b9324141bcc6258b9b7a8deeceec191e60d59b6e8a6ca474b5fc0f4f05ad07/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  6 05:00:35 np0005548918 podman[229435]: 2025-12-06 10:00:35.25369504 +0000 UTC m=+0.122179211 container init 49753e8c51e19d45c484ca951432b85368bb904bf83e249539a4d053f674edeb (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Dec  6 05:00:35 np0005548918 podman[229435]: 2025-12-06 10:00:35.261347628 +0000 UTC m=+0.129831789 container start 49753e8c51e19d45c484ca951432b85368bb904bf83e249539a4d053f674edeb (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:00:35 np0005548918 python3.9[229409]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec  6 05:00:35 np0005548918 podman[229452]: 2025-12-06 10:00:35.278381711 +0000 UTC m=+0.065173442 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Applying nova statedir ownership
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec  6 05:00:35 np0005548918 nova_compute_init[229476]: INFO:nova_statedir:Nova statedir ownership complete
Dec  6 05:00:35 np0005548918 systemd[1]: libpod-49753e8c51e19d45c484ca951432b85368bb904bf83e249539a4d053f674edeb.scope: Deactivated successfully.
Dec  6 05:00:35 np0005548918 podman[229492]: 2025-12-06 10:00:35.348722372 +0000 UTC m=+0.021850255 container died 49753e8c51e19d45c484ca951432b85368bb904bf83e249539a4d053f674edeb (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  6 05:00:35 np0005548918 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-49753e8c51e19d45c484ca951432b85368bb904bf83e249539a4d053f674edeb-userdata-shm.mount: Deactivated successfully.
Dec  6 05:00:35 np0005548918 systemd[1]: var-lib-containers-storage-overlay-67b9324141bcc6258b9b7a8deeceec191e60d59b6e8a6ca474b5fc0f4f05ad07-merged.mount: Deactivated successfully.
Dec  6 05:00:35 np0005548918 podman[229492]: 2025-12-06 10:00:35.3803313 +0000 UTC m=+0.053459173 container cleanup 49753e8c51e19d45c484ca951432b85368bb904bf83e249539a4d053f674edeb (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, tcib_managed=true)
Dec  6 05:00:35 np0005548918 systemd[1]: libpod-conmon-49753e8c51e19d45c484ca951432b85368bb904bf83e249539a4d053f674edeb.scope: Deactivated successfully.
Dec  6 05:00:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:35.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:00:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:00:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:36.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:00:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:36 np0005548918 systemd[1]: session-53.scope: Deactivated successfully.
Dec  6 05:00:36 np0005548918 systemd[1]: session-53.scope: Consumed 2min 24.053s CPU time.
Dec  6 05:00:36 np0005548918 systemd-logind[800]: Session 53 logged out. Waiting for processes to exit.
Dec  6 05:00:36 np0005548918 systemd-logind[800]: Removed session 53.
Dec  6 05:00:36 np0005548918 nova_compute[229246]: 2025-12-06 10:00:36.352 229250 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  6 05:00:36 np0005548918 nova_compute[229246]: 2025-12-06 10:00:36.353 229250 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  6 05:00:36 np0005548918 nova_compute[229246]: 2025-12-06 10:00:36.353 229250 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  6 05:00:36 np0005548918 nova_compute[229246]: 2025-12-06 10:00:36.354 229250 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  6 05:00:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:36 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:36 np0005548918 nova_compute[229246]: 2025-12-06 10:00:36.501 229250 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:00:36 np0005548918 nova_compute[229246]: 2025-12-06 10:00:36.524 229250 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:00:36 np0005548918 nova_compute[229246]: 2025-12-06 10:00:36.525 229250 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  6 05:00:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:00:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:36 np0005548918 nova_compute[229246]: 2025-12-06 10:00:36.951 229250 INFO nova.virt.driver [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.055 229250 INFO nova.compute.provider_config [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  6 05:00:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:37 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.063 229250 DEBUG oslo_concurrency.lockutils [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.063 229250 DEBUG oslo_concurrency.lockutils [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.063 229250 DEBUG oslo_concurrency.lockutils [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.064 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.064 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.064 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.064 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.065 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.065 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.065 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.065 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.065 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.066 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.066 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.066 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.066 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.066 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.067 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.067 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.067 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.067 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.067 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.068 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.068 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.068 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.068 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.069 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.069 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.069 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.069 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.069 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.070 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.070 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.070 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.070 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.070 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.071 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.071 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.071 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.071 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.071 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.072 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.072 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.072 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.072 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.073 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.073 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.073 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.073 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.074 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.074 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.074 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.074 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.075 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.075 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.075 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.075 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.076 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.076 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.076 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.076 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.076 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.077 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:00:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.077 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.077 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.077 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.077 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.077 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.078 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.078 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.078 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.078 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.078 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.079 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.079 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.079 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.079 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.079 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.079 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.080 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.080 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.080 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.080 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.081 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.081 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.081 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.081 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.081 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.082 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.082 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.082 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.082 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.082 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.082 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.083 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.083 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.083 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.083 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.083 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.084 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.084 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.084 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.084 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.084 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.085 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.085 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.085 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.085 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.085 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.086 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.086 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.086 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.086 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.086 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.087 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.087 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.087 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.087 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.088 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.088 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.088 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.088 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.088 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.089 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.089 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.089 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.089 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.090 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.090 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.090 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.090 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.090 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.091 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.091 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.091 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.091 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.091 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.092 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.092 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.092 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.092 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.092 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.093 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.093 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.093 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.093 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.094 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.094 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.094 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.094 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.094 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.095 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.095 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.095 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.095 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.095 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.096 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.096 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.096 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.096 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.096 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.097 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.097 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.097 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.097 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.097 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.098 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.098 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.098 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.098 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.099 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.099 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.099 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.099 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.099 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.100 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.100 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.100 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.100 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.100 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.101 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.101 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.101 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.101 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.102 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.102 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.102 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.102 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.102 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.103 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.103 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.103 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.103 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.103 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.104 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.104 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.104 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.104 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.104 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.105 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.105 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.105 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.105 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.106 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.106 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.106 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.106 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.106 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.106 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.107 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.107 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.107 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.107 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.107 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.108 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.108 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.108 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.108 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.109 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.109 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.109 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.109 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.109 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.110 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.110 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.110 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.110 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.110 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.111 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.111 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.111 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.111 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.112 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.112 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.112 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.112 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.112 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.113 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.113 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.113 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.113 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.113 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.114 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.114 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.114 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.114 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.115 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.115 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.115 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.115 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.115 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.116 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.116 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.116 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.116 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.116 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.117 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.117 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.117 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.117 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.117 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.118 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.118 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.118 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.118 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.118 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.119 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.119 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.119 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.119 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.119 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.120 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.120 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.120 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.120 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.121 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.121 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.121 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.121 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.121 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.121 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.122 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.122 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.122 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.122 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.122 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.123 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.123 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.123 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.123 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.123 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.124 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.124 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.124 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.124 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.124 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.125 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.125 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.125 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.125 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.125 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.125 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.126 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.126 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.126 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.127 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.127 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.127 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.127 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.127 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.127 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.128 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.128 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.128 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.128 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.128 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.129 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.129 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.129 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.129 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.129 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.130 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.130 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.130 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.130 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.130 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.131 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.131 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.131 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.131 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.131 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.132 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.132 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.132 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.132 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.132 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.133 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.133 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.133 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.133 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.133 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.134 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.134 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.134 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.134 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.134 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.135 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.135 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.135 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.135 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.135 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.135 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.136 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.136 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.136 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.136 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.136 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.137 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.137 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.137 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.137 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.138 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.138 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.138 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.138 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.138 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.139 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.139 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.139 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.139 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.139 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.140 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.140 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.140 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.140 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.140 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.141 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.141 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.141 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.141 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.141 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.142 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.142 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.142 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.142 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.142 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.143 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.143 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.143 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.143 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.143 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.143 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.144 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.144 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.144 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.144 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.144 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.145 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.145 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.145 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.145 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.145 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.146 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.146 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.146 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.146 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.146 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.147 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.147 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.147 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.147 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.147 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.147 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.148 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.148 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.148 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.148 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.148 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.149 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.149 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.149 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.149 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.149 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.150 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.150 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.150 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.150 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.150 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.150 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.151 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.151 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.151 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.151 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.151 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.152 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.152 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.152 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.152 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.152 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.153 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.153 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.153 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.153 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.153 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.153 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.154 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.154 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.154 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.154 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.155 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.155 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.155 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.155 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.155 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.156 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.156 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.156 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.156 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.156 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.157 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.157 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.157 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.157 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.157 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.157 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.158 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.158 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.158 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.158 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.158 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.159 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.159 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.159 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.159 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.159 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.160 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.160 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.160 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.160 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.160 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.161 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.161 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.161 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.161 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.161 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.162 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.162 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.162 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.162 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.163 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.163 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.163 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.163 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.163 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.164 229250 WARNING oslo_config.cfg [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  6 05:00:37 np0005548918 nova_compute[229246]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  6 05:00:37 np0005548918 nova_compute[229246]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  6 05:00:37 np0005548918 nova_compute[229246]: and ``live_migration_inbound_addr`` respectively.
Dec  6 05:00:37 np0005548918 nova_compute[229246]: ).  Its value may be silently ignored in the future.#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.164 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.164 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.164 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.165 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.165 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.165 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.165 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.166 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.166 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.166 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.166 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.166 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.167 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.167 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.167 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.167 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.167 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.168 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.168 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.rbd_secret_uuid        = 5ecd3f74-dade-5fc4-92ce-8950ae424258 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.168 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.168 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.168 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.169 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.169 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.169 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.169 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.169 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.170 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.170 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.170 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.170 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.170 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.171 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.171 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.171 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.171 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.171 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.172 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.172 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.172 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.172 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.172 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.173 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.173 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.173 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.173 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.173 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.174 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.174 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.174 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.174 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.175 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.175 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.175 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.175 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.176 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.176 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.176 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.176 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.176 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.177 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.177 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.177 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.177 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.178 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.178 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.178 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.178 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.178 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.179 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.179 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.179 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.179 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.179 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.179 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.180 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.180 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.180 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.180 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.180 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.181 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.181 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.181 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.181 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.181 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.182 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.182 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.182 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:00:37 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.182 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.182 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.183 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.183 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.183 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.183 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.183 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.183 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.184 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.184 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.184 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.184 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.184 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.185 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.185 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.185 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.185 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.185 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.186 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.186 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.186 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.186 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.186 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.187 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.187 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.187 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.187 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.187 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.187 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.188 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.188 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.188 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.188 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.188 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.189 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.189 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.189 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.189 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.190 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.190 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.190 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.190 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.190 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.190 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.191 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.191 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.191 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.191 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.191 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.192 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.192 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.192 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.192 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.193 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.193 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.193 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.193 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.193 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.194 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.194 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.194 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.194 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.195 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.195 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.195 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.195 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.195 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.196 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.196 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.196 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.197 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.197 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.197 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.197 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.197 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.198 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.198 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.198 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.198 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.198 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.199 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.199 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.199 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.199 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.199 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.200 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.200 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.200 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.200 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.200 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.201 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.201 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.201 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.201 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.201 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.202 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.202 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.202 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.202 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.202 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.203 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.203 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.203 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.203 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.203 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.204 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.204 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.204 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.204 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.204 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.205 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.205 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.205 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.205 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.205 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.206 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.206 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.206 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.206 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.206 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.207 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.207 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.207 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.207 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.207 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.208 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.208 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.208 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.208 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.208 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.209 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.209 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.209 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.209 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.210 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.210 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.210 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.210 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.210 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.211 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.211 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.211 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.211 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.211 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.211 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.212 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.212 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.212 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.212 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.212 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.213 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.213 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.213 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.213 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.213 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.214 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.214 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.214 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.214 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.214 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.215 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.215 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.215 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.216 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.216 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.216 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.216 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.216 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.217 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.217 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.217 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.217 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.217 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.218 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.218 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.218 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.218 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.218 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.219 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.219 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.219 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.219 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.220 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.220 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.220 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.220 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.220 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.221 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.221 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.221 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.221 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.221 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.222 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.222 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.222 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.222 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.223 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.223 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.223 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.223 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.223 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.224 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.224 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.224 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.224 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.224 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.225 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.225 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.225 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.225 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.226 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.226 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.226 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.226 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.226 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.227 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.227 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.227 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.227 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.227 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.228 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.228 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.228 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.228 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.229 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.229 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.229 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.229 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.229 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.230 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.230 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.230 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.230 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.231 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.231 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.231 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.231 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.231 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.232 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.232 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.232 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.232 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.233 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.233 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.233 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.233 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.233 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.234 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.234 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.234 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.234 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.234 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.235 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.235 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.235 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.235 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.235 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.236 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.236 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.236 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.236 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.236 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.236 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.237 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.237 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.237 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.237 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.237 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.238 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.238 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.238 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.238 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.238 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.239 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.239 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.239 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.239 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.239 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.240 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.240 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.240 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.240 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.241 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.241 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.241 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.241 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.241 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.242 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.242 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.242 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.242 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.242 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.243 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.243 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.243 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.243 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.243 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.243 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.244 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.244 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.244 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.244 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.244 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.245 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.245 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.245 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.245 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.245 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.246 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.246 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.246 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.246 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.246 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.247 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.247 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.247 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.247 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.247 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.248 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.248 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.248 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.248 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.248 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.249 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.249 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.249 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.249 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.249 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.250 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.250 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.250 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.250 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.250 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.251 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.251 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.251 229250 DEBUG oslo_service.service [None req-4b04fb83-e547-43c7-b3b8-088e9aa03716 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.252 229250 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.269 229250 INFO nova.virt.node [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Determined node identity 31f5f484-bf36-44de-83b8-7b434061a77b from /var/lib/nova/compute_id#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.270 229250 DEBUG nova.virt.libvirt.host [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.271 229250 DEBUG nova.virt.libvirt.host [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.271 229250 DEBUG nova.virt.libvirt.host [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.271 229250 DEBUG nova.virt.libvirt.host [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.283 229250 DEBUG nova.virt.libvirt.host [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f4580a419a0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.287 229250 DEBUG nova.virt.libvirt.host [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f4580a419a0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.288 229250 INFO nova.virt.libvirt.driver [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.299 229250 INFO nova.virt.libvirt.host [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Libvirt host capabilities <capabilities>
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <host>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <uuid>9abb64ab-b4a8-4a69-9492-3018ef71c6f2</uuid>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <cpu>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <arch>x86_64</arch>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model>EPYC-Rome-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <vendor>AMD</vendor>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <microcode version='16777317'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <signature family='23' model='49' stepping='0'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='x2apic'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='tsc-deadline'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='osxsave'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='hypervisor'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='tsc_adjust'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='spec-ctrl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='stibp'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='arch-capabilities'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='ssbd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='cmp_legacy'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='topoext'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='virt-ssbd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='lbrv'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='tsc-scale'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='vmcb-clean'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='pause-filter'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='pfthreshold'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='svme-addr-chk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='rdctl-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='skip-l1dfl-vmentry'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='mds-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature name='pschange-mc-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <pages unit='KiB' size='4'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <pages unit='KiB' size='2048'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <pages unit='KiB' size='1048576'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </cpu>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <power_management>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <suspend_mem/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </power_management>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <iommu support='no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <migration_features>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <live/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <uri_transports>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <uri_transport>tcp</uri_transport>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <uri_transport>rdma</uri_transport>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </uri_transports>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </migration_features>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <topology>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <cells num='1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <cell id='0'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:          <memory unit='KiB'>7864316</memory>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:          <pages unit='KiB' size='4'>1966079</pages>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:          <pages unit='KiB' size='2048'>0</pages>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:          <distances>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:            <sibling id='0' value='10'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:          </distances>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:          <cpus num='8'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:          </cpus>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        </cell>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </cells>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </topology>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <cache>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </cache>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <secmodel>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model>selinux</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <doi>0</doi>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </secmodel>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <secmodel>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model>dac</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <doi>0</doi>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </secmodel>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </host>
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <guest>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <os_type>hvm</os_type>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <arch name='i686'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <wordsize>32</wordsize>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <domain type='qemu'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <domain type='kvm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </arch>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <features>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <pae/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <nonpae/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <acpi default='on' toggle='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <apic default='on' toggle='no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <cpuselection/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <deviceboot/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <disksnapshot default='on' toggle='no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <externalSnapshot/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </features>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </guest>
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <guest>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <os_type>hvm</os_type>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <arch name='x86_64'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <wordsize>64</wordsize>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <domain type='qemu'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <domain type='kvm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </arch>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <features>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <acpi default='on' toggle='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <apic default='on' toggle='no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <cpuselection/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <deviceboot/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <disksnapshot default='on' toggle='no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <externalSnapshot/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </features>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </guest>
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 
Dec  6 05:00:37 np0005548918 nova_compute[229246]: </capabilities>
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.306 229250 DEBUG nova.virt.libvirt.volume.mount [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.309 229250 DEBUG nova.virt.libvirt.host [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.316 229250 DEBUG nova.virt.libvirt.host [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  6 05:00:37 np0005548918 nova_compute[229246]: <domainCapabilities>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <domain>kvm</domain>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <arch>i686</arch>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <vcpu max='240'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <iothreads supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <os supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <enum name='firmware'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <loader supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='type'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>rom</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>pflash</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='readonly'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>yes</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>no</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='secure'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>no</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </loader>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </os>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <cpu>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>on</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>off</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </mode>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='maximumMigratable'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>on</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>off</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </mode>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <vendor>AMD</vendor>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='succor'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </mode>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <mode name='custom' supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cooperlake'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Denverton'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Denverton-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Denverton-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Denverton-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='auto-ibrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='auto-ibrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='GraniteRapids'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx10'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx10-128'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx10-256'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx10-512'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v5'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v6'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v7'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='IvyBridge'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='IvyBridge-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='IvyBridge-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='IvyBridge-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='KnightsMill'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512er'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512pf'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='KnightsMill-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512er'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512pf'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Opteron_G4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Opteron_G4-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Opteron_G5'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tbm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Opteron_G5-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tbm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='SapphireRapids'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='SapphireRapids-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='SapphireRapids-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='SapphireRapids-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='SierraForest'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cmpccxadd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='SierraForest-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cmpccxadd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-v5'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Snowridge'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Snowridge-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Snowridge-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Snowridge-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Snowridge-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='athlon'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='athlon-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='core2duo'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='core2duo-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='coreduo'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='coreduo-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='n270'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='n270-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='phenom'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='phenom-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </mode>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </cpu>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <memoryBacking supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <enum name='sourceType'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <value>file</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <value>anonymous</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <value>memfd</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </memoryBacking>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <devices>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <disk supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='diskDevice'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>disk</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>cdrom</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>floppy</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>lun</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='bus'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>ide</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>fdc</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>scsi</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>usb</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>sata</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='model'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio-transitional</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio-non-transitional</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </disk>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <graphics supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='type'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>vnc</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>egl-headless</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>dbus</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </graphics>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <video supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='modelType'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>vga</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>cirrus</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>none</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>bochs</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>ramfb</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </video>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <hostdev supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='mode'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>subsystem</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='startupPolicy'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>default</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>mandatory</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>requisite</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>optional</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='subsysType'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>usb</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>pci</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>scsi</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='capsType'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='pciBackend'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </hostdev>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <rng supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='model'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio-transitional</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio-non-transitional</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>random</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>egd</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>builtin</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </rng>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <filesystem supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='driverType'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>path</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>handle</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtiofs</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </filesystem>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <tpm supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='model'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>tpm-tis</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>tpm-crb</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>emulator</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>external</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='backendVersion'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>2.0</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </tpm>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <redirdev supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='bus'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>usb</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </redirdev>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <channel supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='type'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>pty</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>unix</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </channel>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <crypto supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='model'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='type'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>qemu</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>builtin</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </crypto>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <interface supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='backendType'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>default</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>passt</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </interface>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <panic supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='model'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>isa</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>hyperv</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </panic>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <console supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='type'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>null</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>vc</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>pty</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>dev</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>file</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>pipe</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>stdio</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>udp</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>tcp</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>unix</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>qemu-vdagent</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>dbus</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </console>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </devices>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <features>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <gic supported='no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <vmcoreinfo supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <genid supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <backingStoreInput supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <backup supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <async-teardown supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <ps2 supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <sev supported='no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <sgx supported='no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <hyperv supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='features'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>relaxed</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>vapic</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>spinlocks</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>vpindex</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>runtime</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>synic</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>stimer</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>reset</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>vendor_id</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>frequencies</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>reenlightenment</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>tlbflush</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>ipi</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>avic</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>emsr_bitmap</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>xmm_input</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <defaults>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <spinlocks>4095</spinlocks>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <stimer_direct>on</stimer_direct>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <tlbflush_direct>on</tlbflush_direct>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <tlbflush_extended>on</tlbflush_extended>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </defaults>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </hyperv>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <launchSecurity supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='sectype'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>tdx</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </launchSecurity>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </features>
Dec  6 05:00:37 np0005548918 nova_compute[229246]: </domainCapabilities>
Dec  6 05:00:37 np0005548918 nova_compute[229246]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.326 229250 DEBUG nova.virt.libvirt.host [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  6 05:00:37 np0005548918 nova_compute[229246]: <domainCapabilities>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <domain>kvm</domain>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <arch>i686</arch>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <vcpu max='4096'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <iothreads supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <os supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <enum name='firmware'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <loader supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='type'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>rom</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>pflash</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='readonly'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>yes</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>no</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='secure'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>no</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </loader>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </os>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <cpu>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>on</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>off</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </mode>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='maximumMigratable'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>on</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>off</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </mode>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <vendor>AMD</vendor>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='succor'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </mode>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <mode name='custom' supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cooperlake'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Denverton'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Denverton-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Denverton-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Denverton-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='auto-ibrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='auto-ibrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='GraniteRapids'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx10'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx10-128'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx10-256'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx10-512'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v5'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v6'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v7'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='IvyBridge'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='IvyBridge-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='IvyBridge-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='IvyBridge-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='KnightsMill'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512er'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512pf'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='KnightsMill-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-4fmaps'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-4vnniw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512er'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512pf'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Opteron_G4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Opteron_G4-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Opteron_G5'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tbm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Opteron_G5-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fma4'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tbm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xop'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='SapphireRapids'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='SapphireRapids-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='SapphireRapids-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='SapphireRapids-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='SierraForest'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cmpccxadd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='SierraForest-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-ne-convert'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cmpccxadd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Client-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Skylake-Server-v5'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Snowridge'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Snowridge-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Snowridge-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Snowridge-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='core-capability'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='split-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Snowridge-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='athlon'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='athlon-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='core2duo'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='core2duo-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='coreduo'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='coreduo-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='n270'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='n270-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='phenom'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='phenom-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnow'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='3dnowext'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </mode>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </cpu>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <memoryBacking supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <enum name='sourceType'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <value>file</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <value>anonymous</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <value>memfd</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </memoryBacking>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <devices>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <disk supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='diskDevice'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>disk</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>cdrom</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>floppy</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>lun</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='bus'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>fdc</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>scsi</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>usb</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>sata</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='model'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio-transitional</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio-non-transitional</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </disk>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <graphics supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='type'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>vnc</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>egl-headless</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>dbus</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </graphics>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <video supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='modelType'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>vga</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>cirrus</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>none</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>bochs</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>ramfb</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </video>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <hostdev supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='mode'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>subsystem</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='startupPolicy'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>default</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>mandatory</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>requisite</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>optional</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='subsysType'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>usb</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>pci</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>scsi</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='capsType'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='pciBackend'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </hostdev>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <rng supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='model'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio-transitional</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtio-non-transitional</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>random</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>egd</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>builtin</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </rng>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <filesystem supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='driverType'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>path</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>handle</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>virtiofs</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </filesystem>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <tpm supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='model'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>tpm-tis</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>tpm-crb</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>emulator</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>external</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='backendVersion'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>2.0</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </tpm>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <redirdev supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='bus'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>usb</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </redirdev>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <channel supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='type'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>pty</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>unix</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </channel>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <crypto supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='model'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='type'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>qemu</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='backendModel'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>builtin</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </crypto>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <interface supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='backendType'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>default</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>passt</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </interface>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <panic supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='model'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>isa</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>hyperv</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </panic>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <console supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='type'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>null</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>vc</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>pty</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>dev</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>file</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>pipe</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>stdio</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>udp</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>tcp</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>unix</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>qemu-vdagent</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>dbus</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </console>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </devices>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <features>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <gic supported='no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <vmcoreinfo supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <genid supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <backingStoreInput supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <backup supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <async-teardown supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <ps2 supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <sev supported='no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <sgx supported='no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <hyperv supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='features'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>relaxed</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>vapic</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>spinlocks</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>vpindex</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>runtime</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>synic</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>stimer</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>reset</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>vendor_id</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>frequencies</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>reenlightenment</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>tlbflush</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>ipi</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>avic</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>emsr_bitmap</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>xmm_input</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <defaults>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <spinlocks>4095</spinlocks>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <stimer_direct>on</stimer_direct>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <tlbflush_direct>on</tlbflush_direct>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <tlbflush_extended>on</tlbflush_extended>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </defaults>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </hyperv>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <launchSecurity supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='sectype'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>tdx</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </launchSecurity>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </features>
Dec  6 05:00:37 np0005548918 nova_compute[229246]: </domainCapabilities>
Dec  6 05:00:37 np0005548918 nova_compute[229246]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.352 229250 DEBUG nova.virt.libvirt.host [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec  6 05:00:37 np0005548918 nova_compute[229246]: 2025-12-06 10:00:37.365 229250 DEBUG nova.virt.libvirt.host [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  6 05:00:37 np0005548918 nova_compute[229246]: <domainCapabilities>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <path>/usr/libexec/qemu-kvm</path>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <domain>kvm</domain>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <arch>x86_64</arch>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <vcpu max='240'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <iothreads supported='yes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <os supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <enum name='firmware'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <loader supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='type'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>rom</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>pflash</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='readonly'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>yes</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>no</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='secure'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>no</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </loader>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  </os>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:  <cpu>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <mode name='host-passthrough' supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='hostPassthroughMigratable'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>on</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>off</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </mode>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <mode name='maximum' supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <enum name='maximumMigratable'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>on</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <value>off</value>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </enum>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </mode>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <mode name='host-model' supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <vendor>AMD</vendor>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='x2apic'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='tsc-deadline'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='hypervisor'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='tsc_adjust'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='spec-ctrl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='stibp'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='ssbd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='cmp_legacy'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='overflow-recov'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='succor'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='ibrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='amd-ssbd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='virt-ssbd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='lbrv'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='tsc-scale'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='vmcb-clean'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='flushbyasid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='pause-filter'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='pfthreshold'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='svme-addr-chk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <feature policy='disable' name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    </mode>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:    <mode name='custom' supported='yes'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-noTSX'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Broadwell-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cascadelake-Server-v5'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cooperlake'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cooperlake-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Cooperlake-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Denverton'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Denverton-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mpx'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Denverton-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Denverton-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Dhyana-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Genoa'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='auto-ibrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Genoa-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='auto-ibrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Milan'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Milan-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Milan-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amd-psfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='no-nested-data-bp'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='null-sel-clr-base'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='stibp-always-on'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Rome'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Rome-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Rome-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-Rome-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='EPYC-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='GraniteRapids'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='GraniteRapids-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='GraniteRapids-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-int8'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='amx-tile'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx-vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx10'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx10-128'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx10-256'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx10-512'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-bf16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-fp16'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512ifma'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='bus-lock-detect'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='cldemote'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fbsdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrc'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fsrs'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='fzrm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ibrs-all'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='mcdt-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdir64b'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='movdiri'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pbrsb-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='prefetchiti'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='psdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='sbdr-ssdp-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='serialize'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='ss'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='taa-no'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='tsx-ldtrk'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xfd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='xsaves'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-noTSX'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-v2'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-v3'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Haswell-v4'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-noTSX'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vpclmulqdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      </blockers>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:      <blockers model='Icelake-Server-v1'>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512-vpopcntdq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bitalg'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512bw'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512cd'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512dq'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512f'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vbmi2'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vl'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='avx512vnni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='erms'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='gfni'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='hle'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='invpcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='la57'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pcid'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='pku'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='rtm'/>
Dec  6 05:00:37 np0005548918 nova_compute[229246]:        <feature name='vaes'/>
Dec  6 05:01:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:30 np0005548918 rsyslogd[1011]: imjournal: 3023 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec  6 05:01:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:01:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:30.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:01:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:30 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c0048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:30 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:01:30 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:01:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:31 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:31 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:01:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:31.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:01:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:01:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:32.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:01:32 np0005548918 podman[229978]: 2025-12-06 10:01:32.208835972 +0000 UTC m=+0.086942524 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:01:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:32 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:33 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c0048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:33 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:33.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:34.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:34 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:35 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:35 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c0048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:35.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:36.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:36 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.538 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.539 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.539 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.539 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.589 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.589 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.589 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.590 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.590 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.590 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.591 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.591 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.591 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.659 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.659 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.659 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.659 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:01:36 np0005548918 nova_compute[229246]: 2025-12-06 10:01:36.660 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:01:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:37 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:01:37 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3744631434' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:01:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:37 np0005548918 nova_compute[229246]: 2025-12-06 10:01:37.108 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:01:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100137 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:01:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:37 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:37 np0005548918 podman[230022]: 2025-12-06 10:01:37.193589518 +0000 UTC m=+0.072452049 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  6 05:01:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:37 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:37 np0005548918 nova_compute[229246]: 2025-12-06 10:01:37.281 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:01:37 np0005548918 nova_compute[229246]: 2025-12-06 10:01:37.283 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5207MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:01:37 np0005548918 nova_compute[229246]: 2025-12-06 10:01:37.283 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:01:37 np0005548918 nova_compute[229246]: 2025-12-06 10:01:37.283 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:01:37 np0005548918 nova_compute[229246]: 2025-12-06 10:01:37.540 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:01:37 np0005548918 nova_compute[229246]: 2025-12-06 10:01:37.540 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:01:37 np0005548918 nova_compute[229246]: 2025-12-06 10:01:37.570 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:01:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:01:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:37.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:01:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:01:38 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1184462004' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:01:38 np0005548918 nova_compute[229246]: 2025-12-06 10:01:38.033 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:01:38 np0005548918 nova_compute[229246]: 2025-12-06 10:01:38.039 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:01:38 np0005548918 nova_compute[229246]: 2025-12-06 10:01:38.072 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:01:38 np0005548918 nova_compute[229246]: 2025-12-06 10:01:38.073 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:01:38 np0005548918 nova_compute[229246]: 2025-12-06 10:01:38.074 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.790s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:01:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:38.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:38 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c0048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:39 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:39 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:39.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:39 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:40.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:40 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:41 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c0048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:41 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:41.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:42.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:42 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:43 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:43 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:43.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:01:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:44.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:01:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:44 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:45.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:45 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:01:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:01:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:46.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:01:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:46 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:47 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:47 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:01:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:47.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:01:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:48.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:48 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:48 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:01:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:48 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:01:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:49 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:49 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004940 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:49.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:49 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:50.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:50 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:51 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:51 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:51.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:51 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:01:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:01:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:52.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:01:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:52 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004960 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:53 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:53 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:01:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:53.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:01:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:01:53.667 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:01:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:01:53.667 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:01:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:01:53.667 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:01:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:54.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:54 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:54 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Dec  6 05:01:55 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2413463250' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec  6 05:01:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:55 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:55 np0005548918 podman[230110]: 2025-12-06 10:01:55.217304333 +0000 UTC m=+0.106046353 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec  6 05:01:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:55 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:55.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:01:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:56.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:01:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:56 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100157 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:01:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:57 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c0049a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:01:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:57.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:01:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:58.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:58 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:01:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:01:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:59 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:01:59 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:01:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:01:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:01:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:59.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:01:59 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:01:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:01:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:00.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Dec  6 05:02:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:00 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c0049c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Dec  6 05:02:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Dec  6 05:02:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Dec  6 05:02:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Dec  6 05:02:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Dec  6 05:02:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Dec  6 05:02:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Dec  6 05:02:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:01 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:01 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:02:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:01.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:02:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.002000054s ======
Dec  6 05:02:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:02.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Dec  6 05:02:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:02 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:03 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c0049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:03 np0005548918 podman[230143]: 2025-12-06 10:02:03.169264595 +0000 UTC m=+0.056530967 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:02:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:03 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:02:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:03.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:02:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:04.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:04 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:04 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:05 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:05 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004a00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:02:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:05.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:02:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:06.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:06 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:07 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:07 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:02:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:07.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:02:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:08 np0005548918 podman[230169]: 2025-12-06 10:02:08.161967259 +0000 UTC m=+0.053687550 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  6 05:02:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:08.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:08 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:09 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:09 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:09.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:09 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:02:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:10.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:02:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:10 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:11 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:11 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:11.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:12.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:12 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:13 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5444004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:13 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:13 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Dec  6 05:02:13 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2729948875' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec  6 05:02:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:13.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:14.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:14 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:14 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:15 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f546c004aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:15 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:15.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:02:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:16.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:02:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:16 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004af0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:17 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:17 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:17.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:18.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:18 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:19 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:19 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:02:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:19.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:02:19 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:20.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:20 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:21 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:21 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:02:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:21.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:02:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:02:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:22.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:02:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:22 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:23 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:23 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:23.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:24.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:24 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:24 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:25 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:25 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:25.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:26 np0005548918 podman[230234]: 2025-12-06 10:02:26.186054234 +0000 UTC m=+0.067557677 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible)
Dec  6 05:02:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:02:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:26.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:02:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:26 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:27 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:27 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:02:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:27.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:02:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:28.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:28 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:29 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:29 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:29.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:30.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:30 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f545c004b90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:31 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:31 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5474001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:02:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:02:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:02:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:02:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:02:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:31.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:02:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:02:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:32.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:02:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:32 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:33 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:33 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:33.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:34.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:34 np0005548918 podman[230375]: 2025-12-06 10:02:34.201657165 +0000 UTC m=+0.088270650 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  6 05:02:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:34 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f543c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:35 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454004490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:35 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:02:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:35.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:02:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:02:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:36.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:02:36 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:02:36 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:02:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:36 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:37 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f543c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:37 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454004490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:02:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:37.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:02:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.065 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.066 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.082 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.082 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.083 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.083 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.083 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.107 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.108 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.108 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.108 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.109 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:02:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:02:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:38.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:02:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:38 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:02:38 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2540342928' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.606 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.769 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.770 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5240MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.771 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.771 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:02:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.876 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.876 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:02:38 np0005548918 nova_compute[229246]: 2025-12-06 10:02:38.907 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:02:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:39 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:39 np0005548918 podman[230467]: 2025-12-06 10:02:39.195873948 +0000 UTC m=+0.082870344 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  6 05:02:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:39 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f543c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:39 np0005548918 nova_compute[229246]: 2025-12-06 10:02:39.396 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:02:39 np0005548918 nova_compute[229246]: 2025-12-06 10:02:39.402 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:02:39 np0005548918 nova_compute[229246]: 2025-12-06 10:02:39.420 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:02:39 np0005548918 nova_compute[229246]: 2025-12-06 10:02:39.421 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:02:39 np0005548918 nova_compute[229246]: 2025-12-06 10:02:39.422 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.651s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:02:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:39.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:39 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:39 np0005548918 nova_compute[229246]: 2025-12-06 10:02:39.874 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:39 np0005548918 nova_compute[229246]: 2025-12-06 10:02:39.875 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:02:39 np0005548918 nova_compute[229246]: 2025-12-06 10:02:39.875 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:02:39 np0005548918 nova_compute[229246]: 2025-12-06 10:02:39.896 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:02:39 np0005548918 nova_compute[229246]: 2025-12-06 10:02:39.897 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:39 np0005548918 nova_compute[229246]: 2025-12-06 10:02:39.897 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:39 np0005548918 nova_compute[229246]: 2025-12-06 10:02:39.897 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:02:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:40.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:40 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454004490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:41 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:41 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5448002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:02:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:41.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:02:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:02:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:42.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:02:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:42 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f543c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:43 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5454004490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:02:43 np0005548918 kernel: ganesha.nfsd[230074]: segfault at 50 ip 00007f552815a32e sp 00007f54e97f9210 error 4 in libntirpc.so.5.8[7f552813f000+2c000] likely on CPU 2 (core 0, socket 2)
Dec  6 05:02:43 np0005548918 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 05:02:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[203590]: 06/12/2025 10:02:43 : epoch 6933fe19 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5450004920 fd 38 proxy ignored for local
Dec  6 05:02:43 np0005548918 systemd[1]: Started Process Core Dump (PID 230495/UID 0).
Dec  6 05:02:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:43.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:44.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:44 np0005548918 systemd-coredump[230496]: Process 203594 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 69:#012#0  0x00007f552815a32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 05:02:44 np0005548918 systemd[1]: systemd-coredump@4-230495-0.service: Deactivated successfully.
Dec  6 05:02:44 np0005548918 systemd[1]: systemd-coredump@4-230495-0.service: Consumed 1.119s CPU time.
Dec  6 05:02:44 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 05:02:44 np0005548918 podman[230503]: 2025-12-06 10:02:44.531883545 +0000 UTC m=+0.030962866 container died b5d95f66151d04075342e932f946282ba7899cc2664931d61ab4171c151f756b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid)
Dec  6 05:02:44 np0005548918 systemd[1]: var-lib-containers-storage-overlay-07dede252d6d4d715c4c6238c1928f24d65a663b0e0d5407b5d663a6ecfde2d8-merged.mount: Deactivated successfully.
Dec  6 05:02:44 np0005548918 podman[230503]: 2025-12-06 10:02:44.576335683 +0000 UTC m=+0.075414944 container remove b5d95f66151d04075342e932f946282ba7899cc2664931d61ab4171c151f756b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 05:02:44 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Main process exited, code=exited, status=139/n/a
Dec  6 05:02:44 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Failed with result 'exit-code'.
Dec  6 05:02:44 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.770s CPU time.
Dec  6 05:02:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:02:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:45.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:02:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  6 05:02:45 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/795001534' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  6 05:02:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  6 05:02:45 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/795001534' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  6 05:02:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:02:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:46.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:02:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:47.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:48.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100249 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:02:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:49.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:49 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:50.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:51.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:02:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:52.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:02:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:02:53.667 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:02:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:02:53.668 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:02:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:02:53.668 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:02:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:53.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:54.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:54 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Scheduled restart job, restart counter is at 5.
Dec  6 05:02:54 np0005548918 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:02:54 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.770s CPU time.
Dec  6 05:02:54 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:54 np0005548918 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 05:02:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:55 np0005548918 podman[230633]: 2025-12-06 10:02:55.148564368 +0000 UTC m=+0.053396460 container create 41f44ee11093096c63d2c40d7eb65f7a6876e7d1de38280f05a7f47084cfc40f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec  6 05:02:55 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88e29f6e0ac3dd4ac24547850ccdea9055cd8cf7f2390af8653b0f5bed6a4ff9/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 05:02:55 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88e29f6e0ac3dd4ac24547850ccdea9055cd8cf7f2390af8653b0f5bed6a4ff9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 05:02:55 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88e29f6e0ac3dd4ac24547850ccdea9055cd8cf7f2390af8653b0f5bed6a4ff9/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:02:55 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88e29f6e0ac3dd4ac24547850ccdea9055cd8cf7f2390af8653b0f5bed6a4ff9/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.sseuqb-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:02:55 np0005548918 podman[230633]: 2025-12-06 10:02:55.204962828 +0000 UTC m=+0.109794920 container init 41f44ee11093096c63d2c40d7eb65f7a6876e7d1de38280f05a7f47084cfc40f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec  6 05:02:55 np0005548918 podman[230633]: 2025-12-06 10:02:55.210262061 +0000 UTC m=+0.115094133 container start 41f44ee11093096c63d2c40d7eb65f7a6876e7d1de38280f05a7f47084cfc40f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Dec  6 05:02:55 np0005548918 bash[230633]: 41f44ee11093096c63d2c40d7eb65f7a6876e7d1de38280f05a7f47084cfc40f
Dec  6 05:02:55 np0005548918 podman[230633]: 2025-12-06 10:02:55.132495245 +0000 UTC m=+0.037327337 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 05:02:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:02:55 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 05:02:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:02:55 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 05:02:55 np0005548918 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:02:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:02:55 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 05:02:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:02:55 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 05:02:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:02:55 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 05:02:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:02:55 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 05:02:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:02:55 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 05:02:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:02:55 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:02:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:55.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:56.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:57 np0005548918 podman[230690]: 2025-12-06 10:02:57.225260948 +0000 UTC m=+0.116424710 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  6 05:02:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:02:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:57.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:02:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:02:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:58.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:02:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:02:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:02:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:02:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:02:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:59.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:02:59 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:02:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:02:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:00.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:01 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:03:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:01 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:03:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:01.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:02.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:03.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:04.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:04 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:05 np0005548918 podman[230723]: 2025-12-06 10:03:05.167935419 +0000 UTC m=+0.052859906 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:03:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:03:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:05.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:03:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:06.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:03:06.962 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:03:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:03:06.964 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:03:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:03:06.964 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:03:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:07.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:08.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:08 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:09 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:09 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:03:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:09.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:03:09 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:10 np0005548918 podman[230764]: 2025-12-06 10:03:10.205366568 +0000 UTC m=+0.076768540 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  6 05:03:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:10.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:10 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:11 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100311 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:03:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:11 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:03:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:11.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:03:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:03:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:12.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:03:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:12 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:13 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:13 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:13.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:14.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:14 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:14 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:15 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94980016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:15 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:03:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:15.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:03:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:03:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:16.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:03:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:16 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:17 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:17 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:03:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:17.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:03:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:18.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:18 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:19 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:19 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:03:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:19.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:03:19 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:20.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:20 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:21 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:21 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8009d80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:21.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:22.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:22 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:23 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:23 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:23.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:24.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:24 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8009d80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:24 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:25 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:25 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94980032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:25.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:03:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:26.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:03:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:26 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:27 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:27 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8009d80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:27.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:28 np0005548918 podman[230827]: 2025-12-06 10:03:28.195269029 +0000 UTC m=+0.087289584 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  6 05:03:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:28.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:28 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94980032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:29 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:29 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:29.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:30.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:30 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8009f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:31 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:31 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:31.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:32.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:32 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:33 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8009f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:33 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:33.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:34.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:34 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:35 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:35 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8009f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:03:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:35.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:03:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:35 np0005548918 podman[230910]: 2025-12-06 10:03:35.902490513 +0000 UTC m=+0.054544701 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  6 05:03:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:36.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:36 np0005548918 nova_compute[229246]: 2025-12-06 10:03:36.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:03:36 np0005548918 nova_compute[229246]: 2025-12-06 10:03:36.537 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:03:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:36 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:36 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:03:36 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:03:36 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:03:36 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:03:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:37 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:37 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:37 np0005548918 nova_compute[229246]: 2025-12-06 10:03:37.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:03:37 np0005548918 nova_compute[229246]: 2025-12-06 10:03:37.561 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:03:37 np0005548918 nova_compute[229246]: 2025-12-06 10:03:37.561 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:03:37 np0005548918 nova_compute[229246]: 2025-12-06 10:03:37.562 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:03:37 np0005548918 nova_compute[229246]: 2025-12-06 10:03:37.562 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:03:37 np0005548918 nova_compute[229246]: 2025-12-06 10:03:37.562 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:03:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:37.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:37 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:03:37 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2255593849' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:03:38 np0005548918 nova_compute[229246]: 2025-12-06 10:03:38.006 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:03:38 np0005548918 nova_compute[229246]: 2025-12-06 10:03:38.148 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:03:38 np0005548918 nova_compute[229246]: 2025-12-06 10:03:38.150 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5253MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:03:38 np0005548918 nova_compute[229246]: 2025-12-06 10:03:38.150 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:03:38 np0005548918 nova_compute[229246]: 2025-12-06 10:03:38.150 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:03:38 np0005548918 nova_compute[229246]: 2025-12-06 10:03:38.206 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:03:38 np0005548918 nova_compute[229246]: 2025-12-06 10:03:38.207 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:03:38 np0005548918 nova_compute[229246]: 2025-12-06 10:03:38.223 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:03:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:03:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:38.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:03:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:38 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8009f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:03:38 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2555214126' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:03:38 np0005548918 nova_compute[229246]: 2025-12-06 10:03:38.660 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:03:38 np0005548918 nova_compute[229246]: 2025-12-06 10:03:38.666 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:03:38 np0005548918 nova_compute[229246]: 2025-12-06 10:03:38.689 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:03:38 np0005548918 nova_compute[229246]: 2025-12-06 10:03:38.692 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:03:38 np0005548918 nova_compute[229246]: 2025-12-06 10:03:38.692 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:03:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:39 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:39 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:39 np0005548918 nova_compute[229246]: 2025-12-06 10:03:39.687 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:03:39 np0005548918 nova_compute[229246]: 2025-12-06 10:03:39.687 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:03:39 np0005548918 nova_compute[229246]: 2025-12-06 10:03:39.687 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:03:39 np0005548918 nova_compute[229246]: 2025-12-06 10:03:39.687 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:03:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:39.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:39 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:39 np0005548918 nova_compute[229246]: 2025-12-06 10:03:39.962 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:03:39 np0005548918 nova_compute[229246]: 2025-12-06 10:03:39.963 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:03:39 np0005548918 nova_compute[229246]: 2025-12-06 10:03:39.963 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:03:39 np0005548918 nova_compute[229246]: 2025-12-06 10:03:39.963 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:03:39 np0005548918 nova_compute[229246]: 2025-12-06 10:03:39.963 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:03:39 np0005548918 nova_compute[229246]: 2025-12-06 10:03:39.963 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:03:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:40.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:40 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:41 np0005548918 podman[231034]: 2025-12-06 10:03:41.171123194 +0000 UTC m=+0.057018328 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  6 05:03:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:41 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8009f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:41 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94b4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:41.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:42.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:42 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:03:42 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:03:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:42 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:43 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:43 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8009f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:43.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:44.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:44 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94b40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:45 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:45 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a8000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:45.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  6 05:03:45 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2191484556' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  6 05:03:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  6 05:03:45 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2191484556' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  6 05:03:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:46.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:46 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:47 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94b40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:47 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:47.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:48.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:48 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a8001aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:49 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:49 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:03:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:49.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:03:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:50.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:50 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:51 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a8001aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:51 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:51.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:52.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:52 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94b4002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:53 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:53 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80027b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:03:53.669 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:03:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:03:53.670 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:03:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:03:53.670 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:03:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:53.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:54.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:54 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:03:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:55 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94b4002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:55 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:03:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:55.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:03:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:56.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:56 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a8002930 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:57 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:57 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94b4002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:03:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:57.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:03:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:03:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:58.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:03:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:58 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:59 np0005548918 podman[231123]: 2025-12-06 10:03:59.20719645 +0000 UTC m=+0.097503458 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  6 05:03:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:03:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:03:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:59 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a8003250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:03:59 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:03:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:03:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:03:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:59.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:03:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:03:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:00.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:00 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94b4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:01 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:01 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a8003250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:01.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:02.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:02 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:03 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94b4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:03 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:03.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:04.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:04 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a8003250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:04.803556) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444803626, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2347, "num_deletes": 251, "total_data_size": 6048622, "memory_usage": 6142176, "flush_reason": "Manual Compaction"}
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444839627, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 3957290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20842, "largest_seqno": 23184, "table_properties": {"data_size": 3947958, "index_size": 5826, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19302, "raw_average_key_size": 20, "raw_value_size": 3929213, "raw_average_value_size": 4084, "num_data_blocks": 257, "num_entries": 962, "num_filter_entries": 962, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015229, "oldest_key_time": 1765015229, "file_creation_time": 1765015444, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 36119 microseconds, and 15290 cpu microseconds.
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:04.839677) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 3957290 bytes OK
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:04.839696) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:04.841405) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:04.841416) EVENT_LOG_v1 {"time_micros": 1765015444841413, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:04.841431) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6038218, prev total WAL file size 6038218, number of live WAL files 2.
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:04.842744) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(3864KB)], [39(13MB)]
Dec  6 05:04:04 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444842836, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 17911806, "oldest_snapshot_seqno": -1}
Dec  6 05:04:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5481 keys, 15736627 bytes, temperature: kUnknown
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015445011170, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 15736627, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15697138, "index_size": 24659, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 138109, "raw_average_key_size": 25, "raw_value_size": 15595066, "raw_average_value_size": 2845, "num_data_blocks": 1018, "num_entries": 5481, "num_filter_entries": 5481, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765015444, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:05.011546) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 15736627 bytes
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:05.013518) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 106.3 rd, 93.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 13.3 +0.0 blob) out(15.0 +0.0 blob), read-write-amplify(8.5) write-amplify(4.0) OK, records in: 5997, records dropped: 516 output_compression: NoCompression
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:05.013546) EVENT_LOG_v1 {"time_micros": 1765015445013534, "job": 22, "event": "compaction_finished", "compaction_time_micros": 168490, "compaction_time_cpu_micros": 28393, "output_level": 6, "num_output_files": 1, "total_output_size": 15736627, "num_input_records": 5997, "num_output_records": 5481, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:04.842568) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:05.013650) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:05.013923) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:05.013928) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:05.013931) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:05.013933) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015445015039, "job": 0, "event": "table_file_deletion", "file_number": 41}
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015445017476, "job": 0, "event": "table_file_deletion", "file_number": 39}
Dec  6 05:04:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:05 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:05 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94b4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:05.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:06 np0005548918 podman[231157]: 2025-12-06 10:04:06.155302332 +0000 UTC m=+0.047706127 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  6 05:04:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:06.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:06 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:04:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:07.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:04:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:08.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:08 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94b4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:09 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:09 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:09.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:10.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:10 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:11 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94b4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:11 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:11.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:12 np0005548918 podman[231207]: 2025-12-06 10:04:12.198786909 +0000 UTC m=+0.084710104 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  6 05:04:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:12.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:12 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:13 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:13 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:13.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:14.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:14 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:15 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:15 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:15.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:16.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:16 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:17 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:17 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:17.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:18.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:18 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:19 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:19 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:19.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:04:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:20.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:04:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:20 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:21 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:21 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:04:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:21.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:04:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:22.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:22 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:23 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:23 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:23.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:24.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:24 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:25 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9490000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:25 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:25.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:26.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:26 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:27 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:27 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:04:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:27.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:04:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:28.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:28 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:29 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:29 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:29.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:30 np0005548918 podman[231248]: 2025-12-06 10:04:30.216407221 +0000 UTC m=+0.102480624 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  6 05:04:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:30.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:30 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:31 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:31 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:31.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:32.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:32 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:33 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:33 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:33.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:34.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:34 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:35 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94900016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:35 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c002b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:04:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:35.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:04:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:04:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:36.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:04:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:36 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:37 np0005548918 podman[231306]: 2025-12-06 10:04:37.166668212 +0000 UTC m=+0.055121387 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:04:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:37 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:37 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9490002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:37 np0005548918 nova_compute[229246]: 2025-12-06 10:04:37.537 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.724765) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477724791, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 533, "num_deletes": 251, "total_data_size": 842271, "memory_usage": 852080, "flush_reason": "Manual Compaction"}
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477730500, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 393629, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23190, "largest_seqno": 23717, "table_properties": {"data_size": 391065, "index_size": 600, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6539, "raw_average_key_size": 19, "raw_value_size": 385955, "raw_average_value_size": 1148, "num_data_blocks": 27, "num_entries": 336, "num_filter_entries": 336, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015445, "oldest_key_time": 1765015445, "file_creation_time": 1765015477, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 5773 microseconds, and 1807 cpu microseconds.
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.730537) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 393629 bytes OK
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.730555) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.732346) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.732360) EVENT_LOG_v1 {"time_micros": 1765015477732356, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.732374) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 839177, prev total WAL file size 839177, number of live WAL files 2.
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.732831) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353030' seq:72057594037927935, type:22 .. '6D67727374617400373532' seq:0, type:0; will stop at (end)
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(384KB)], [42(15MB)]
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477732885, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 16130256, "oldest_snapshot_seqno": -1}
Dec  6 05:04:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:37.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5317 keys, 12216201 bytes, temperature: kUnknown
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477879891, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 12216201, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12182102, "index_size": 19717, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13317, "raw_key_size": 135080, "raw_average_key_size": 25, "raw_value_size": 12087114, "raw_average_value_size": 2273, "num_data_blocks": 802, "num_entries": 5317, "num_filter_entries": 5317, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765015477, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.880275) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 12216201 bytes
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.881866) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 109.7 rd, 83.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 15.0 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(72.0) write-amplify(31.0) OK, records in: 5817, records dropped: 500 output_compression: NoCompression
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.881905) EVENT_LOG_v1 {"time_micros": 1765015477881889, "job": 24, "event": "compaction_finished", "compaction_time_micros": 147104, "compaction_time_cpu_micros": 24255, "output_level": 6, "num_output_files": 1, "total_output_size": 12216201, "num_input_records": 5817, "num_output_records": 5317, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477882635, "job": 24, "event": "table_file_deletion", "file_number": 44}
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477886518, "job": 24, "event": "table_file_deletion", "file_number": 42}
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.732741) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.886666) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.886670) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.886672) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.886673) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:37 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:04:37.886675) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:04:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:04:38 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3692914369' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:04:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:38.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:38 np0005548918 nova_compute[229246]: 2025-12-06 10:04:38.531 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:04:38 np0005548918 nova_compute[229246]: 2025-12-06 10:04:38.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:04:38 np0005548918 nova_compute[229246]: 2025-12-06 10:04:38.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:04:38 np0005548918 nova_compute[229246]: 2025-12-06 10:04:38.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:04:38 np0005548918 nova_compute[229246]: 2025-12-06 10:04:38.554 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:04:38 np0005548918 nova_compute[229246]: 2025-12-06 10:04:38.554 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:04:38 np0005548918 nova_compute[229246]: 2025-12-06 10:04:38.555 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:04:38 np0005548918 nova_compute[229246]: 2025-12-06 10:04:38.555 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:04:38 np0005548918 nova_compute[229246]: 2025-12-06 10:04:38.555 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:04:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:38 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c0044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=404 latency=0.002000053s ======
Dec  6 05:04:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:38.743 +0000] "GET /healthcheck HTTP/1.1" 404 242 - "python-urllib3/1.26.5" - latency=0.002000053s
Dec  6 05:04:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:39 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:04:39 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1352342801' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:04:39 np0005548918 nova_compute[229246]: 2025-12-06 10:04:39.058 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:04:39 np0005548918 nova_compute[229246]: 2025-12-06 10:04:39.262 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:04:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:39 np0005548918 nova_compute[229246]: 2025-12-06 10:04:39.263 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5225MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:04:39 np0005548918 nova_compute[229246]: 2025-12-06 10:04:39.263 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:04:39 np0005548918 nova_compute[229246]: 2025-12-06 10:04:39.264 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:04:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:39 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c0044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:39 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c0044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:39 np0005548918 nova_compute[229246]: 2025-12-06 10:04:39.336 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:04:39 np0005548918 nova_compute[229246]: 2025-12-06 10:04:39.336 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:04:39 np0005548918 nova_compute[229246]: 2025-12-06 10:04:39.360 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:04:39 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:04:39 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1282252340' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:04:39 np0005548918 nova_compute[229246]: 2025-12-06 10:04:39.794 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:04:39 np0005548918 nova_compute[229246]: 2025-12-06 10:04:39.802 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:04:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:39.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:39 np0005548918 nova_compute[229246]: 2025-12-06 10:04:39.824 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:04:39 np0005548918 nova_compute[229246]: 2025-12-06 10:04:39.826 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:04:39 np0005548918 nova_compute[229246]: 2025-12-06 10:04:39.826 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.562s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:04:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:40.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:40 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9490002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:40 np0005548918 nova_compute[229246]: 2025-12-06 10:04:40.828 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:04:40 np0005548918 nova_compute[229246]: 2025-12-06 10:04:40.846 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:04:40 np0005548918 nova_compute[229246]: 2025-12-06 10:04:40.846 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:04:40 np0005548918 nova_compute[229246]: 2025-12-06 10:04:40.847 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:04:40 np0005548918 nova_compute[229246]: 2025-12-06 10:04:40.848 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:04:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:41 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:41 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:41 np0005548918 nova_compute[229246]: 2025-12-06 10:04:41.537 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:04:41 np0005548918 nova_compute[229246]: 2025-12-06 10:04:41.538 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:04:41 np0005548918 nova_compute[229246]: 2025-12-06 10:04:41.538 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:04:41 np0005548918 nova_compute[229246]: 2025-12-06 10:04:41.561 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:04:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:41.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:42.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:42 np0005548918 podman[231497]: 2025-12-06 10:04:42.403902382 +0000 UTC m=+0.082677246 container exec 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  6 05:04:42 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  6 05:04:42 np0005548918 podman[231497]: 2025-12-06 10:04:42.542518431 +0000 UTC m=+0.221293265 container exec_died 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 05:04:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:42 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c0044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:42 np0005548918 podman[231530]: 2025-12-06 10:04:42.686408083 +0000 UTC m=+0.089714749 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd)
Dec  6 05:04:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:42 np0005548918 podman[231636]: 2025-12-06 10:04:42.992050438 +0000 UTC m=+0.061112866 container exec 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 05:04:43 np0005548918 podman[231636]: 2025-12-06 10:04:43.024267381 +0000 UTC m=+0.093329819 container exec_died 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 05:04:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:43 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c0044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:43 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c0044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:43 np0005548918 podman[231725]: 2025-12-06 10:04:43.359838625 +0000 UTC m=+0.051089591 container exec 41f44ee11093096c63d2c40d7eb65f7a6876e7d1de38280f05a7f47084cfc40f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 05:04:43 np0005548918 podman[231725]: 2025-12-06 10:04:43.370747784 +0000 UTC m=+0.061998730 container exec_died 41f44ee11093096c63d2c40d7eb65f7a6876e7d1de38280f05a7f47084cfc40f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 05:04:43 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e147 e147: 3 total, 3 up, 3 in
Dec  6 05:04:43 np0005548918 podman[231787]: 2025-12-06 10:04:43.566344784 +0000 UTC m=+0.042710181 container exec 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna)
Dec  6 05:04:43 np0005548918 podman[231787]: 2025-12-06 10:04:43.575412052 +0000 UTC m=+0.051777459 container exec_died 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna)
Dec  6 05:04:43 np0005548918 podman[231854]: 2025-12-06 10:04:43.757660496 +0000 UTC m=+0.049588390 container exec cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg, distribution-scope=public, name=keepalived, version=2.2.4, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, release=1793, vcs-type=git, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec  6 05:04:43 np0005548918 podman[231854]: 2025-12-06 10:04:43.785534739 +0000 UTC m=+0.077462613 container exec_died cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, architecture=x86_64, name=keepalived, release=1793, version=2.2.4, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Dec  6 05:04:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:43.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:44.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:44 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:44 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:44 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:44 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e148 e148: 3 total, 3 up, 3 in
Dec  6 05:04:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:44 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9490002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:45 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:45 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:45 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  6 05:04:45 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  6 05:04:45 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:04:45 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:45 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:45 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:04:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e149 e149: 3 total, 3 up, 3 in
Dec  6 05:04:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:45.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:46.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:46 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c0044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:47 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9490002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:47 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:47.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e150 e150: 3 total, 3 up, 3 in
Dec  6 05:04:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:48.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:48 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a4003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:49 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c0044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:49 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c0044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:49.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:50.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:50 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:50 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:50 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:04:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:51 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:51 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94900036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:51.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:52.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:52 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c0044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:52 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e151 e151: 3 total, 3 up, 3 in
Dec  6 05:04:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:53 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c0044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:53 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f949c0044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:04:53.670 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:04:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:04:53.671 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:04:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:04:53.671 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:04:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:53.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:54.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:54 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94900036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:04:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:55 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:55 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:55.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:56.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:56 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a40040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:57 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94900036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:57 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:04:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:57.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:04:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:58.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:58 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:04:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:04:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:59 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a40040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:04:59 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94900036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:04:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:04:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:04:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:59.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:04:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:04:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:00.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:00 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:01 np0005548918 podman[232071]: 2025-12-06 10:05:01.249702414 +0000 UTC m=+0.122446496 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  6 05:05:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:01 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94c8008de0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:01 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a40040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:01.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:02.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:02 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94900036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:03 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:03 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:03.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  6 05:05:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:04.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  6 05:05:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:04 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a40040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:05 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94900036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:05 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:05.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:06.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:06 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a40040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:07 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94900036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:07.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:08 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:05:08.047 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:05:08 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:05:08.048 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:05:08 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:05:08.050 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:05:08 np0005548918 podman[232106]: 2025-12-06 10:05:08.177996392 +0000 UTC m=+0.064973231 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  6 05:05:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:08.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:08 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a80042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:09 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9498001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[230647]: 06/12/2025 10:05:09 : epoch 6933ff4f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f94a40040d0 fd 38 proxy ignored for local
Dec  6 05:05:09 np0005548918 kernel: ganesha.nfsd[230760]: segfault at 50 ip 00007f957545a32e sp 00007f952dffa210 error 4 in libntirpc.so.5.8[7f957543f000+2c000] likely on CPU 7 (core 0, socket 7)
Dec  6 05:05:09 np0005548918 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 05:05:09 np0005548918 systemd[1]: Started Process Core Dump (PID 232126/UID 0).
Dec  6 05:05:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:09.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100510 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:05:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:10.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:10 np0005548918 systemd-coredump[232127]: Process 230651 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 53:#012#0  0x00007f957545a32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 05:05:10 np0005548918 systemd[1]: systemd-coredump@5-232126-0.service: Deactivated successfully.
Dec  6 05:05:10 np0005548918 systemd[1]: systemd-coredump@5-232126-0.service: Consumed 1.294s CPU time.
Dec  6 05:05:10 np0005548918 podman[232133]: 2025-12-06 10:05:10.835961602 +0000 UTC m=+0.043132074 container died 41f44ee11093096c63d2c40d7eb65f7a6876e7d1de38280f05a7f47084cfc40f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Dec  6 05:05:10 np0005548918 systemd[1]: var-lib-containers-storage-overlay-88e29f6e0ac3dd4ac24547850ccdea9055cd8cf7f2390af8653b0f5bed6a4ff9-merged.mount: Deactivated successfully.
Dec  6 05:05:10 np0005548918 podman[232133]: 2025-12-06 10:05:10.877482288 +0000 UTC m=+0.084652760 container remove 41f44ee11093096c63d2c40d7eb65f7a6876e7d1de38280f05a7f47084cfc40f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  6 05:05:10 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Main process exited, code=exited, status=139/n/a
Dec  6 05:05:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:11 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Failed with result 'exit-code'.
Dec  6 05:05:11 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.691s CPU time.
Dec  6 05:05:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:11.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:12.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:13 np0005548918 podman[232203]: 2025-12-06 10:05:13.17859227 +0000 UTC m=+0.069489475 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:05:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:13.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:14.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100515 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:05:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:15.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:16.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:17.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:18.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:19.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:20.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:21 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Scheduled restart job, restart counter is at 6.
Dec  6 05:05:21 np0005548918 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:05:21 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.691s CPU time.
Dec  6 05:05:21 np0005548918 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 05:05:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:21 np0005548918 podman[232281]: 2025-12-06 10:05:21.287461148 +0000 UTC m=+0.038802884 container create 54ee4eab8c62c38b4bde619ddd33181c6187b2f7e5025eaebc667b3481ee5955 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 05:05:21 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccc7b516cfb488e4a02550397d0293a2c7677aa88723e02dc64e3567916ebb90/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 05:05:21 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccc7b516cfb488e4a02550397d0293a2c7677aa88723e02dc64e3567916ebb90/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 05:05:21 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccc7b516cfb488e4a02550397d0293a2c7677aa88723e02dc64e3567916ebb90/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:05:21 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccc7b516cfb488e4a02550397d0293a2c7677aa88723e02dc64e3567916ebb90/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.sseuqb-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:05:21 np0005548918 podman[232281]: 2025-12-06 10:05:21.348757197 +0000 UTC m=+0.100098953 container init 54ee4eab8c62c38b4bde619ddd33181c6187b2f7e5025eaebc667b3481ee5955 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Dec  6 05:05:21 np0005548918 podman[232281]: 2025-12-06 10:05:21.35686306 +0000 UTC m=+0.108204796 container start 54ee4eab8c62c38b4bde619ddd33181c6187b2f7e5025eaebc667b3481ee5955 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 05:05:21 np0005548918 bash[232281]: 54ee4eab8c62c38b4bde619ddd33181c6187b2f7e5025eaebc667b3481ee5955
Dec  6 05:05:21 np0005548918 podman[232281]: 2025-12-06 10:05:21.27182938 +0000 UTC m=+0.023171136 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 05:05:21 np0005548918 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:05:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:21 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 05:05:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:21 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 05:05:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:21 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 05:05:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:21 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 05:05:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:21 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 05:05:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:21 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 05:05:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:21 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 05:05:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:21 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:05:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:21.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:22.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:23.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:24.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:25.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:26.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:27 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  6 05:05:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:27 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  6 05:05:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:27 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:05:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:27 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:05:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:27 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:05:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:27.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:28.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:29 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:05:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:29 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:05:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:29 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:05:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:29 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:05:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:29 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:05:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:29 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:05:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:29 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:05:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:29.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:30.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:31 np0005548918 podman[232372]: 2025-12-06 10:05:31.440104485 +0000 UTC m=+0.127804353 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:05:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:31.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100532 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:05:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:32.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:33.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:34.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000018:nfs.cephfs.1: -2
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa374000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa350000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e152 e152: 3 total, 3 up, 3 in
Dec  6 05:05:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:35.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:36.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:36 np0005548918 nova_compute[229246]: 2025-12-06 10:05:36.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:36 np0005548918 nova_compute[229246]: 2025-12-06 10:05:36.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  6 05:05:36 np0005548918 nova_compute[229246]: 2025-12-06 10:05:36.550 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  6 05:05:36 np0005548918 nova_compute[229246]: 2025-12-06 10:05:36.551 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:36 np0005548918 nova_compute[229246]: 2025-12-06 10:05:36.551 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  6 05:05:36 np0005548918 nova_compute[229246]: 2025-12-06 10:05:36.570 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:36 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e153 e153: 3 total, 3 up, 3 in
Dec  6 05:05:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:36 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100537 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:05:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:37 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa344000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:37 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa358000e00 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:37.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:38.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:38 np0005548918 nova_compute[229246]: 2025-12-06 10:05:38.579 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:38 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa344000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:39 np0005548918 podman[232422]: 2025-12-06 10:05:39.178278834 +0000 UTC m=+0.063223954 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec  6 05:05:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:39 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa374002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:39 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3500016a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:39 np0005548918 nova_compute[229246]: 2025-12-06 10:05:39.530 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:39 np0005548918 nova_compute[229246]: 2025-12-06 10:05:39.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:39 np0005548918 nova_compute[229246]: 2025-12-06 10:05:39.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:39 np0005548918 nova_compute[229246]: 2025-12-06 10:05:39.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:05:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:39.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:40.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:40 np0005548918 nova_compute[229246]: 2025-12-06 10:05:40.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:40 np0005548918 nova_compute[229246]: 2025-12-06 10:05:40.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:40 np0005548918 nova_compute[229246]: 2025-12-06 10:05:40.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:40 np0005548918 nova_compute[229246]: 2025-12-06 10:05:40.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:40 np0005548918 nova_compute[229246]: 2025-12-06 10:05:40.560 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:05:40 np0005548918 nova_compute[229246]: 2025-12-06 10:05:40.560 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:05:40 np0005548918 nova_compute[229246]: 2025-12-06 10:05:40.561 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:05:40 np0005548918 nova_compute[229246]: 2025-12-06 10:05:40.561 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:05:40 np0005548918 nova_compute[229246]: 2025-12-06 10:05:40.561 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:05:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:40 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa358001920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:41 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:05:41 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3766974847' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.054 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.212 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.213 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5230MB free_disk=59.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.213 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.214 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:05:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.324 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.324 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:05:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:41 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:41 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa374002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.387 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing inventories for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.472 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating ProviderTree inventory for provider 31f5f484-bf36-44de-83b8-7b434061a77b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.473 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating inventory in ProviderTree for provider 31f5f484-bf36-44de-83b8-7b434061a77b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.487 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing aggregate associations for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.522 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing trait associations for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE4A,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_ABM,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,HW_CPU_X86_AVX2,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.545 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:05:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:41.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:41 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:05:41 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3852513152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.989 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:05:41 np0005548918 nova_compute[229246]: 2025-12-06 10:05:41.996 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:05:42 np0005548918 nova_compute[229246]: 2025-12-06 10:05:42.010 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:05:42 np0005548918 nova_compute[229246]: 2025-12-06 10:05:42.012 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:05:42 np0005548918 nova_compute[229246]: 2025-12-06 10:05:42.012 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.798s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:05:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:42.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:42 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa350001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:42 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 e154: 3 total, 3 up, 3 in
Dec  6 05:05:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:43 np0005548918 nova_compute[229246]: 2025-12-06 10:05:43.012 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:05:43 np0005548918 nova_compute[229246]: 2025-12-06 10:05:43.013 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:05:43 np0005548918 nova_compute[229246]: 2025-12-06 10:05:43.013 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:05:43 np0005548918 nova_compute[229246]: 2025-12-06 10:05:43.034 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:05:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:43 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:43 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa358002240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:43.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:44 np0005548918 podman[232492]: 2025-12-06 10:05:44.185984416 +0000 UTC m=+0.060337194 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:05:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:44.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:44 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3740021b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:45 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa350001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:45 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:45.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:46.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:46 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa358002240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  6 05:05:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/850046515' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  6 05:05:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  6 05:05:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/850046515' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  6 05:05:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:47 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa374002350 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:47 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa350001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:47.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:48.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:48 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:49 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa358002240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:49 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa374009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:49.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:50.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:50 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3500032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:51 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:51 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:51.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100552 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:05:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:52.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:52 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.803315) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552803334, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1148, "num_deletes": 256, "total_data_size": 2543514, "memory_usage": 2579568, "flush_reason": "Manual Compaction"}
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552814496, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1680401, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23722, "largest_seqno": 24865, "table_properties": {"data_size": 1675158, "index_size": 2703, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11296, "raw_average_key_size": 19, "raw_value_size": 1664427, "raw_average_value_size": 2864, "num_data_blocks": 118, "num_entries": 581, "num_filter_entries": 581, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015477, "oldest_key_time": 1765015477, "file_creation_time": 1765015552, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 11278 microseconds, and 4220 cpu microseconds.
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.814589) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1680401 bytes OK
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.814607) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.817405) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.817420) EVENT_LOG_v1 {"time_micros": 1765015552817415, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.817434) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2537850, prev total WAL file size 2537850, number of live WAL files 2.
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.818019) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323532' seq:72057594037927935, type:22 .. '6C6F676D00353034' seq:0, type:0; will stop at (end)
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1641KB)], [45(11MB)]
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552818048, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13896602, "oldest_snapshot_seqno": -1}
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5364 keys, 13714551 bytes, temperature: kUnknown
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552884593, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13714551, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13678327, "index_size": 21714, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13445, "raw_key_size": 137335, "raw_average_key_size": 25, "raw_value_size": 13580742, "raw_average_value_size": 2531, "num_data_blocks": 884, "num_entries": 5364, "num_filter_entries": 5364, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765015552, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.884846) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13714551 bytes
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.886266) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 208.4 rd, 205.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 11.7 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(16.4) write-amplify(8.2) OK, records in: 5898, records dropped: 534 output_compression: NoCompression
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.886281) EVENT_LOG_v1 {"time_micros": 1765015552886274, "job": 26, "event": "compaction_finished", "compaction_time_micros": 66667, "compaction_time_cpu_micros": 24146, "output_level": 6, "num_output_files": 1, "total_output_size": 13714551, "num_input_records": 5898, "num_output_records": 5364, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552886777, "job": 26, "event": "table_file_deletion", "file_number": 47}
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552888938, "job": 26, "event": "table_file_deletion", "file_number": 45}
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.817951) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.889027) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.889031) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.889033) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.889036) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:05:52 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:05:52.889039) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:05:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:53 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3500032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:53 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa358003590 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:53 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec  6 05:05:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:05:53.671 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:05:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:05:53.673 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:05:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:05:53.673 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:05:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:53.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:54.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:54 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:54 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:05:54 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:05:54 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:05:54 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:05:54 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:05:54 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:05:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:05:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:55 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa37400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:55 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa350004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:55 np0005548918 ceph-mon[75798]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec  6 05:05:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:55.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:05:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:56.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:05:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:56 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa358003590 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:57 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa37400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:57 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:57.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:58.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:58 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa350004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:05:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:05:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:59 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa358003590 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:05:59 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:05:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:05:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:05:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:59.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:05:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:05:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:00.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:00 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:00 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:06:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:01 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa350004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:01 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa350004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:06:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:06:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:01.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:02 np0005548918 podman[232660]: 2025-12-06 10:06:02.23117851 +0000 UTC m=+0.111233579 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec  6 05:06:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:02.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:02 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:03 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:03 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa350004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:03.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:03 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:06:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:03 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:06:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:04.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:04 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa350004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:05 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:05 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:05.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:06.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:06 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa350004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:06 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:06:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:07 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:07 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:07.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:08 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:08.113 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:06:08 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:08.114 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:06:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:08.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:08 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:09 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:06:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:09 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa350004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:09 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:09.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:10 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:10.116 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:06:10 np0005548918 podman[232695]: 2025-12-06 10:06:10.173069652 +0000 UTC m=+0.055033310 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  6 05:06:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:10.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:10 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:11 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:11 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa344000d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:11.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100612 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:06:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:12.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:12 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:12 np0005548918 nova_compute[229246]: 2025-12-06 10:06:12.963 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "a6718b79-80e4-4d66-bff2-537e642a14f9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:12 np0005548918 nova_compute[229246]: 2025-12-06 10:06:12.963 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:12 np0005548918 nova_compute[229246]: 2025-12-06 10:06:12.976 229250 DEBUG nova.compute.manager [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.041 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.042 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.049 229250 DEBUG nova.virt.hardware [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.049 229250 INFO nova.compute.claims [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.128 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:06:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:13 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:13 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:13 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:06:13 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4031308316' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.560 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.566 229250 DEBUG nova.compute.provider_tree [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.586 229250 DEBUG nova.scheduler.client.report [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.638 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.639 229250 DEBUG nova.compute.manager [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.710 229250 DEBUG nova.compute.manager [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.711 229250 DEBUG nova.network.neutron [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.739 229250 INFO nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.758 229250 DEBUG nova.compute.manager [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.876 229250 DEBUG nova.compute.manager [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.878 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.879 229250 INFO nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Creating image(s)#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.910 229250 DEBUG nova.storage.rbd_utils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image a6718b79-80e4-4d66-bff2-537e642a14f9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:06:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:13.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.935 229250 DEBUG nova.storage.rbd_utils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image a6718b79-80e4-4d66-bff2-537e642a14f9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.958 229250 DEBUG nova.storage.rbd_utils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image a6718b79-80e4-4d66-bff2-537e642a14f9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.961 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "1b7208203e670301d076a006cb3364d3eb842050" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:13 np0005548918 nova_compute[229246]: 2025-12-06 10:06:13.962 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "1b7208203e670301d076a006cb3364d3eb842050" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:14 np0005548918 nova_compute[229246]: 2025-12-06 10:06:14.279 229250 DEBUG nova.virt.libvirt.imagebackend [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Image locations are: [{'url': 'rbd://5ecd3f74-dade-5fc4-92ce-8950ae424258/images/9489b8a5-a798-4e26-87f9-59bb1eb2e6fd/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://5ecd3f74-dade-5fc4-92ce-8950ae424258/images/9489b8a5-a798-4e26-87f9-59bb1eb2e6fd/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Dec  6 05:06:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:14.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:14 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa344000d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:14 np0005548918 nova_compute[229246]: 2025-12-06 10:06:14.860 229250 WARNING oslo_policy.policy [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec  6 05:06:14 np0005548918 nova_compute[229246]: 2025-12-06 10:06:14.861 229250 WARNING oslo_policy.policy [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec  6 05:06:14 np0005548918 nova_compute[229246]: 2025-12-06 10:06:14.863 229250 DEBUG nova.policy [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '03615580775245e6ae335ee9d785611f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  6 05:06:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:15 np0005548918 podman[232822]: 2025-12-06 10:06:15.16635393 +0000 UTC m=+0.056352795 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  6 05:06:15 np0005548918 nova_compute[229246]: 2025-12-06 10:06:15.192 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:06:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:15 np0005548918 nova_compute[229246]: 2025-12-06 10:06:15.281 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050.part --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:06:15 np0005548918 nova_compute[229246]: 2025-12-06 10:06:15.283 229250 DEBUG nova.virt.images [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] 9489b8a5-a798-4e26-87f9-59bb1eb2e6fd was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Dec  6 05:06:15 np0005548918 nova_compute[229246]: 2025-12-06 10:06:15.284 229250 DEBUG nova.privsep.utils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  6 05:06:15 np0005548918 nova_compute[229246]: 2025-12-06 10:06:15.285 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050.part /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:06:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:15 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:15 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:15 np0005548918 nova_compute[229246]: 2025-12-06 10:06:15.480 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050.part /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050.converted" returned: 0 in 0.195s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:06:15 np0005548918 nova_compute[229246]: 2025-12-06 10:06:15.490 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:06:15 np0005548918 nova_compute[229246]: 2025-12-06 10:06:15.561 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050.converted --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:06:15 np0005548918 nova_compute[229246]: 2025-12-06 10:06:15.563 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "1b7208203e670301d076a006cb3364d3eb842050" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:15 np0005548918 nova_compute[229246]: 2025-12-06 10:06:15.599 229250 DEBUG nova.storage.rbd_utils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image a6718b79-80e4-4d66-bff2-537e642a14f9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:06:15 np0005548918 nova_compute[229246]: 2025-12-06 10:06:15.605 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 a6718b79-80e4-4d66-bff2-537e642a14f9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:06:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:15.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:15 np0005548918 nova_compute[229246]: 2025-12-06 10:06:15.947 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 a6718b79-80e4-4d66-bff2-537e642a14f9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.342s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:06:16 np0005548918 nova_compute[229246]: 2025-12-06 10:06:16.023 229250 DEBUG nova.storage.rbd_utils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] resizing rbd image a6718b79-80e4-4d66-bff2-537e642a14f9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Dec  6 05:06:16 np0005548918 nova_compute[229246]: 2025-12-06 10:06:16.133 229250 DEBUG nova.objects.instance [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lazy-loading 'migration_context' on Instance uuid a6718b79-80e4-4d66-bff2-537e642a14f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  6 05:06:16 np0005548918 nova_compute[229246]: 2025-12-06 10:06:16.148 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  6 05:06:16 np0005548918 nova_compute[229246]: 2025-12-06 10:06:16.148 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Ensure instance console log exists: /var/lib/nova/instances/a6718b79-80e4-4d66-bff2-537e642a14f9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  6 05:06:16 np0005548918 nova_compute[229246]: 2025-12-06 10:06:16.149 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:16 np0005548918 nova_compute[229246]: 2025-12-06 10:06:16.149 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:16 np0005548918 nova_compute[229246]: 2025-12-06 10:06:16.149 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.002000054s ======
Dec  6 05:06:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:16.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Dec  6 05:06:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:16 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:16 np0005548918 nova_compute[229246]: 2025-12-06 10:06:16.937 229250 DEBUG nova.network.neutron [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Successfully created port: 9b30064b-3b78-4be4-bd9a-743cb550a78d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  6 05:06:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:17 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa344001f30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:17 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:17.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:18.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:18 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:18 np0005548918 nova_compute[229246]: 2025-12-06 10:06:18.902 229250 DEBUG nova.network.neutron [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Successfully updated port: 9b30064b-3b78-4be4-bd9a-743cb550a78d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  6 05:06:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:18 np0005548918 nova_compute[229246]: 2025-12-06 10:06:18.922 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "refresh_cache-a6718b79-80e4-4d66-bff2-537e642a14f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:06:18 np0005548918 nova_compute[229246]: 2025-12-06 10:06:18.922 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquired lock "refresh_cache-a6718b79-80e4-4d66-bff2-537e642a14f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:06:18 np0005548918 nova_compute[229246]: 2025-12-06 10:06:18.923 229250 DEBUG nova.network.neutron [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  6 05:06:18 np0005548918 nova_compute[229246]: 2025-12-06 10:06:18.993 229250 DEBUG nova.compute.manager [req-da867f9b-00e9-431b-9609-7a64630314c2 req-0adf312d-acf0-48cf-a8ac-774e082447a6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Received event network-changed-9b30064b-3b78-4be4-bd9a-743cb550a78d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:06:18 np0005548918 nova_compute[229246]: 2025-12-06 10:06:18.994 229250 DEBUG nova.compute.manager [req-da867f9b-00e9-431b-9609-7a64630314c2 req-0adf312d-acf0-48cf-a8ac-774e082447a6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Refreshing instance network info cache due to event network-changed-9b30064b-3b78-4be4-bd9a-743cb550a78d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  6 05:06:18 np0005548918 nova_compute[229246]: 2025-12-06 10:06:18.994 229250 DEBUG oslo_concurrency.lockutils [req-da867f9b-00e9-431b-9609-7a64630314c2 req-0adf312d-acf0-48cf-a8ac-774e082447a6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "refresh_cache-a6718b79-80e4-4d66-bff2-537e642a14f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:06:19 np0005548918 nova_compute[229246]: 2025-12-06 10:06:19.189 229250 DEBUG nova.network.neutron [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  6 05:06:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:19 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:19 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa344001f30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:19.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.366 229250 DEBUG nova.network.neutron [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Updating instance_info_cache with network_info: [{"id": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "address": "fa:16:3e:38:c4:70", "network": {"id": "b5acf2ad-ba64-4833-ba3e-fc2228d0500f", "bridge": "br-int", "label": "tempest-network-smoke--179060463", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b30064b-3b", "ovs_interfaceid": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.394 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Releasing lock "refresh_cache-a6718b79-80e4-4d66-bff2-537e642a14f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.394 229250 DEBUG nova.compute.manager [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Instance network_info: |[{"id": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "address": "fa:16:3e:38:c4:70", "network": {"id": "b5acf2ad-ba64-4833-ba3e-fc2228d0500f", "bridge": "br-int", "label": "tempest-network-smoke--179060463", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b30064b-3b", "ovs_interfaceid": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.395 229250 DEBUG oslo_concurrency.lockutils [req-da867f9b-00e9-431b-9609-7a64630314c2 req-0adf312d-acf0-48cf-a8ac-774e082447a6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquired lock "refresh_cache-a6718b79-80e4-4d66-bff2-537e642a14f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.395 229250 DEBUG nova.network.neutron [req-da867f9b-00e9-431b-9609-7a64630314c2 req-0adf312d-acf0-48cf-a8ac-774e082447a6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Refreshing network info cache for port 9b30064b-3b78-4be4-bd9a-743cb550a78d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.398 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Start _get_guest_xml network_info=[{"id": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "address": "fa:16:3e:38:c4:70", "network": {"id": "b5acf2ad-ba64-4833-ba3e-fc2228d0500f", "bridge": "br-int", "label": "tempest-network-smoke--179060463", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b30064b-3b", "ovs_interfaceid": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:04:42Z,direct_url=<?>,disk_format='qcow2',id=9489b8a5-a798-4e26-87f9-59bb1eb2e6fd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3e0ab101ca7547d4a515169a0f2edef3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-06T10:04:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '9489b8a5-a798-4e26-87f9-59bb1eb2e6fd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.403 229250 WARNING nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.407 229250 DEBUG nova.virt.libvirt.host [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.408 229250 DEBUG nova.virt.libvirt.host [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.411 229250 DEBUG nova.virt.libvirt.host [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.411 229250 DEBUG nova.virt.libvirt.host [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.411 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.412 229250 DEBUG nova.virt.hardware [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-06T10:04:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0a252b9c-cc5f-41b2-a8b2-94fcf6e74d22',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:04:42Z,direct_url=<?>,disk_format='qcow2',id=9489b8a5-a798-4e26-87f9-59bb1eb2e6fd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3e0ab101ca7547d4a515169a0f2edef3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-06T10:04:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.412 229250 DEBUG nova.virt.hardware [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.412 229250 DEBUG nova.virt.hardware [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.412 229250 DEBUG nova.virt.hardware [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.413 229250 DEBUG nova.virt.hardware [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.413 229250 DEBUG nova.virt.hardware [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.413 229250 DEBUG nova.virt.hardware [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.413 229250 DEBUG nova.virt.hardware [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.413 229250 DEBUG nova.virt.hardware [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.414 229250 DEBUG nova.virt.hardware [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.414 229250 DEBUG nova.virt.hardware [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.418 229250 DEBUG nova.privsep.utils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.418 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:06:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:20.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:20 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa344001f30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  6 05:06:20 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1556517146' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.862 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.895 229250 DEBUG nova.storage.rbd_utils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image a6718b79-80e4-4d66-bff2-537e642a14f9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:06:20 np0005548918 nova_compute[229246]: 2025-12-06 10:06:20.901 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:06:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:21 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  6 05:06:21 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3225489691' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  6 05:06:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:21 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:21 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.401 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.403 229250 DEBUG nova.virt.libvirt.vif [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:06:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-514803950',display_name='tempest-TestNetworkBasicOps-server-514803950',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-514803950',id=2,image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBANfrbb3vVLiiIIaJGU6ReqTxhWXK99uQbAyPO3sQfSe2idiKGUdXmXt18P0yE+abn4bDTp3JoNmgFWotoifbNcqtioAi5Sf2d/ivSV7j6kZBvD8Bee93HLbcefk+780Cw==',key_name='tempest-TestNetworkBasicOps-1252291136',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='92b402c8d3e2476abc98be42a1e6d34e',ramdisk_id='',reservation_id='r-6sph5n4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1971100882',owner_user_name='tempest-TestNetworkBasicOps-1971100882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:06:13Z,user_data=None,user_id='03615580775245e6ae335ee9d785611f',uuid=a6718b79-80e4-4d66-bff2-537e642a14f9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "address": "fa:16:3e:38:c4:70", "network": {"id": "b5acf2ad-ba64-4833-ba3e-fc2228d0500f", "bridge": "br-int", "label": "tempest-network-smoke--179060463", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b30064b-3b", "ovs_interfaceid": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.404 229250 DEBUG nova.network.os_vif_util [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converting VIF {"id": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "address": "fa:16:3e:38:c4:70", "network": {"id": "b5acf2ad-ba64-4833-ba3e-fc2228d0500f", "bridge": "br-int", "label": "tempest-network-smoke--179060463", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b30064b-3b", "ovs_interfaceid": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.404 229250 DEBUG nova.network.os_vif_util [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:c4:70,bridge_name='br-int',has_traffic_filtering=True,id=9b30064b-3b78-4be4-bd9a-743cb550a78d,network=Network(b5acf2ad-ba64-4833-ba3e-fc2228d0500f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b30064b-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.406 229250 DEBUG nova.objects.instance [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lazy-loading 'pci_devices' on Instance uuid a6718b79-80e4-4d66-bff2-537e642a14f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.421 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] End _get_guest_xml xml=<domain type="kvm">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  <uuid>a6718b79-80e4-4d66-bff2-537e642a14f9</uuid>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  <name>instance-00000002</name>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  <memory>131072</memory>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  <vcpu>1</vcpu>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  <metadata>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <nova:name>tempest-TestNetworkBasicOps-server-514803950</nova:name>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <nova:creationTime>2025-12-06 10:06:20</nova:creationTime>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <nova:flavor name="m1.nano">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <nova:memory>128</nova:memory>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <nova:disk>1</nova:disk>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <nova:swap>0</nova:swap>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <nova:ephemeral>0</nova:ephemeral>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <nova:vcpus>1</nova:vcpus>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      </nova:flavor>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <nova:owner>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <nova:user uuid="03615580775245e6ae335ee9d785611f">tempest-TestNetworkBasicOps-1971100882-project-member</nova:user>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <nova:project uuid="92b402c8d3e2476abc98be42a1e6d34e">tempest-TestNetworkBasicOps-1971100882</nova:project>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      </nova:owner>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <nova:root type="image" uuid="9489b8a5-a798-4e26-87f9-59bb1eb2e6fd"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <nova:ports>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <nova:port uuid="9b30064b-3b78-4be4-bd9a-743cb550a78d">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:          <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        </nova:port>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      </nova:ports>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    </nova:instance>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  </metadata>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  <sysinfo type="smbios">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <system>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <entry name="manufacturer">RDO</entry>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <entry name="product">OpenStack Compute</entry>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <entry name="serial">a6718b79-80e4-4d66-bff2-537e642a14f9</entry>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <entry name="uuid">a6718b79-80e4-4d66-bff2-537e642a14f9</entry>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <entry name="family">Virtual Machine</entry>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    </system>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  </sysinfo>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  <os>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <boot dev="hd"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <smbios mode="sysinfo"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  </os>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  <features>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <acpi/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <apic/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <vmcoreinfo/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  </features>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  <clock offset="utc">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <timer name="pit" tickpolicy="delay"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <timer name="hpet" present="no"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  </clock>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  <cpu mode="host-model" match="exact">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <topology sockets="1" cores="1" threads="1"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  </cpu>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  <devices>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <disk type="network" device="disk">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <driver type="raw" cache="none"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <source protocol="rbd" name="vms/a6718b79-80e4-4d66-bff2-537e642a14f9_disk">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <host name="192.168.122.100" port="6789"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <host name="192.168.122.102" port="6789"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <host name="192.168.122.101" port="6789"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      </source>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <auth username="openstack">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <secret type="ceph" uuid="5ecd3f74-dade-5fc4-92ce-8950ae424258"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      </auth>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <target dev="vda" bus="virtio"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    </disk>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <disk type="network" device="cdrom">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <driver type="raw" cache="none"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <source protocol="rbd" name="vms/a6718b79-80e4-4d66-bff2-537e642a14f9_disk.config">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <host name="192.168.122.100" port="6789"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <host name="192.168.122.102" port="6789"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <host name="192.168.122.101" port="6789"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      </source>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <auth username="openstack">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:        <secret type="ceph" uuid="5ecd3f74-dade-5fc4-92ce-8950ae424258"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      </auth>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <target dev="sda" bus="sata"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    </disk>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <interface type="ethernet">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <mac address="fa:16:3e:38:c4:70"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <model type="virtio"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <driver name="vhost" rx_queue_size="512"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <mtu size="1442"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <target dev="tap9b30064b-3b"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    </interface>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <serial type="pty">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <log file="/var/lib/nova/instances/a6718b79-80e4-4d66-bff2-537e642a14f9/console.log" append="off"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    </serial>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <video>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <model type="virtio"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    </video>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <input type="tablet" bus="usb"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <rng model="virtio">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <backend model="random">/dev/urandom</backend>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    </rng>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <controller type="usb" index="0"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    <memballoon model="virtio">
Dec  6 05:06:21 np0005548918 nova_compute[229246]:      <stats period="10"/>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:    </memballoon>
Dec  6 05:06:21 np0005548918 nova_compute[229246]:  </devices>
Dec  6 05:06:21 np0005548918 nova_compute[229246]: </domain>
Dec  6 05:06:21 np0005548918 nova_compute[229246]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.423 229250 DEBUG nova.compute.manager [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Preparing to wait for external event network-vif-plugged-9b30064b-3b78-4be4-bd9a-743cb550a78d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.424 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.424 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.425 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.425 229250 DEBUG nova.virt.libvirt.vif [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:06:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-514803950',display_name='tempest-TestNetworkBasicOps-server-514803950',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-514803950',id=2,image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBANfrbb3vVLiiIIaJGU6ReqTxhWXK99uQbAyPO3sQfSe2idiKGUdXmXt18P0yE+abn4bDTp3JoNmgFWotoifbNcqtioAi5Sf2d/ivSV7j6kZBvD8Bee93HLbcefk+780Cw==',key_name='tempest-TestNetworkBasicOps-1252291136',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='92b402c8d3e2476abc98be42a1e6d34e',ramdisk_id='',reservation_id='r-6sph5n4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1971100882',owner_user_name='tempest-TestNetworkBasicOps-1971100882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:06:13Z,user_data=None,user_id='03615580775245e6ae335ee9d785611f',uuid=a6718b79-80e4-4d66-bff2-537e642a14f9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "address": "fa:16:3e:38:c4:70", "network": {"id": "b5acf2ad-ba64-4833-ba3e-fc2228d0500f", "bridge": "br-int", "label": "tempest-network-smoke--179060463", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b30064b-3b", "ovs_interfaceid": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.426 229250 DEBUG nova.network.os_vif_util [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converting VIF {"id": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "address": "fa:16:3e:38:c4:70", "network": {"id": "b5acf2ad-ba64-4833-ba3e-fc2228d0500f", "bridge": "br-int", "label": "tempest-network-smoke--179060463", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b30064b-3b", "ovs_interfaceid": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.426 229250 DEBUG nova.network.os_vif_util [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:c4:70,bridge_name='br-int',has_traffic_filtering=True,id=9b30064b-3b78-4be4-bd9a-743cb550a78d,network=Network(b5acf2ad-ba64-4833-ba3e-fc2228d0500f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b30064b-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.428 229250 DEBUG os_vif [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:c4:70,bridge_name='br-int',has_traffic_filtering=True,id=9b30064b-3b78-4be4-bd9a-743cb550a78d,network=Network(b5acf2ad-ba64-4833-ba3e-fc2228d0500f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b30064b-3b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.512 229250 DEBUG ovsdbapp.backend.ovs_idl [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.512 229250 DEBUG ovsdbapp.backend.ovs_idl [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.512 229250 DEBUG ovsdbapp.backend.ovs_idl [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.513 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.513 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [POLLOUT] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.513 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.514 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.515 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.517 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.529 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.530 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.530 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  6 05:06:21 np0005548918 nova_compute[229246]: 2025-12-06 10:06:21.531 229250 INFO oslo.privsep.daemon [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpny__vd1y/privsep.sock']#033[00m
Dec  6 05:06:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:21.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.104 229250 DEBUG nova.network.neutron [req-da867f9b-00e9-431b-9609-7a64630314c2 req-0adf312d-acf0-48cf-a8ac-774e082447a6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Updated VIF entry in instance network info cache for port 9b30064b-3b78-4be4-bd9a-743cb550a78d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.104 229250 DEBUG nova.network.neutron [req-da867f9b-00e9-431b-9609-7a64630314c2 req-0adf312d-acf0-48cf-a8ac-774e082447a6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Updating instance_info_cache with network_info: [{"id": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "address": "fa:16:3e:38:c4:70", "network": {"id": "b5acf2ad-ba64-4833-ba3e-fc2228d0500f", "bridge": "br-int", "label": "tempest-network-smoke--179060463", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b30064b-3b", "ovs_interfaceid": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.125 229250 DEBUG oslo_concurrency.lockutils [req-da867f9b-00e9-431b-9609-7a64630314c2 req-0adf312d-acf0-48cf-a8ac-774e082447a6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Releasing lock "refresh_cache-a6718b79-80e4-4d66-bff2-537e642a14f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.277 229250 INFO oslo.privsep.daemon [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.124 233043 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.128 233043 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.130 233043 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.130 233043 INFO oslo.privsep.daemon [-] privsep daemon running as pid 233043#033[00m
Dec  6 05:06:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:22.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.647 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.648 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b30064b-3b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.648 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9b30064b-3b, col_values=(('external_ids', {'iface-id': '9b30064b-3b78-4be4-bd9a-743cb550a78d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:c4:70', 'vm-uuid': 'a6718b79-80e4-4d66-bff2-537e642a14f9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.650 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:22 np0005548918 NetworkManager[48884]: <info>  [1765015582.6513] manager: (tap9b30064b-3b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.653 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.658 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.660 229250 INFO os_vif [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:c4:70,bridge_name='br-int',has_traffic_filtering=True,id=9b30064b-3b78-4be4-bd9a-743cb550a78d,network=Network(b5acf2ad-ba64-4833-ba3e-fc2228d0500f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b30064b-3b')#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.717 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.718 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.718 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] No VIF found with MAC fa:16:3e:38:c4:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.719 229250 INFO nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Using config drive#033[00m
Dec  6 05:06:22 np0005548918 nova_compute[229246]: 2025-12-06 10:06:22.752 229250 DEBUG nova.storage.rbd_utils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image a6718b79-80e4-4d66-bff2-537e642a14f9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:06:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:22 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa344001f30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:23 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa344001f30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:23 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:23 np0005548918 nova_compute[229246]: 2025-12-06 10:06:23.921 229250 INFO nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Creating config drive at /var/lib/nova/instances/a6718b79-80e4-4d66-bff2-537e642a14f9/disk.config#033[00m
Dec  6 05:06:23 np0005548918 nova_compute[229246]: 2025-12-06 10:06:23.927 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a6718b79-80e4-4d66-bff2-537e642a14f9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa6u478pm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:06:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:23.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:23 np0005548918 nova_compute[229246]: 2025-12-06 10:06:23.953 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.063 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a6718b79-80e4-4d66-bff2-537e642a14f9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa6u478pm" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.095 229250 DEBUG nova.storage.rbd_utils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image a6718b79-80e4-4d66-bff2-537e642a14f9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.100 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a6718b79-80e4-4d66-bff2-537e642a14f9/disk.config a6718b79-80e4-4d66-bff2-537e642a14f9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.260 229250 DEBUG oslo_concurrency.processutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a6718b79-80e4-4d66-bff2-537e642a14f9/disk.config a6718b79-80e4-4d66-bff2-537e642a14f9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.159s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.261 229250 INFO nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Deleting local config drive /var/lib/nova/instances/a6718b79-80e4-4d66-bff2-537e642a14f9/disk.config because it was imported into RBD.#033[00m
Dec  6 05:06:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:24 np0005548918 systemd[1]: Starting libvirt secret daemon...
Dec  6 05:06:24 np0005548918 systemd[1]: Started libvirt secret daemon.
Dec  6 05:06:24 np0005548918 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec  6 05:06:24 np0005548918 kernel: tap9b30064b-3b: entered promiscuous mode
Dec  6 05:06:24 np0005548918 NetworkManager[48884]: <info>  [1765015584.3873] manager: (tap9b30064b-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Dec  6 05:06:24 np0005548918 ovn_controller[132371]: 2025-12-06T10:06:24Z|00033|binding|INFO|Claiming lport 9b30064b-3b78-4be4-bd9a-743cb550a78d for this chassis.
Dec  6 05:06:24 np0005548918 ovn_controller[132371]: 2025-12-06T10:06:24Z|00034|binding|INFO|9b30064b-3b78-4be4-bd9a-743cb550a78d: Claiming fa:16:3e:38:c4:70 10.100.0.27
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.391 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:24 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:24.403 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:c4:70 10.100.0.27'], port_security=['fa:16:3e:38:c4:70 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': 'a6718b79-80e4-4d66-bff2-537e642a14f9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5acf2ad-ba64-4833-ba3e-fc2228d0500f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8306047b-b173-44bd-a997-036bd27c9520', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6dc6f73b-6456-426e-b8ba-888fa95f6d1c, chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], logical_port=9b30064b-3b78-4be4-bd9a-743cb550a78d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:06:24 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:24.404 141640 INFO neutron.agent.ovn.metadata.agent [-] Port 9b30064b-3b78-4be4-bd9a-743cb550a78d in datapath b5acf2ad-ba64-4833-ba3e-fc2228d0500f bound to our chassis#033[00m
Dec  6 05:06:24 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:24.406 141640 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5acf2ad-ba64-4833-ba3e-fc2228d0500f#033[00m
Dec  6 05:06:24 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:24.407 141640 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmps4gsrbfx/privsep.sock']#033[00m
Dec  6 05:06:24 np0005548918 systemd-udevd[233144]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 05:06:24 np0005548918 NetworkManager[48884]: <info>  [1765015584.4426] device (tap9b30064b-3b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  6 05:06:24 np0005548918 NetworkManager[48884]: <info>  [1765015584.4435] device (tap9b30064b-3b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  6 05:06:24 np0005548918 systemd-machined[192688]: New machine qemu-1-instance-00000002.
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.467 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:24 np0005548918 ovn_controller[132371]: 2025-12-06T10:06:24Z|00035|binding|INFO|Setting lport 9b30064b-3b78-4be4-bd9a-743cb550a78d ovn-installed in OVS
Dec  6 05:06:24 np0005548918 ovn_controller[132371]: 2025-12-06T10:06:24Z|00036|binding|INFO|Setting lport 9b30064b-3b78-4be4-bd9a-743cb550a78d up in Southbound
Dec  6 05:06:24 np0005548918 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Dec  6 05:06:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.473 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:24.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:24 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.909 229250 DEBUG nova.virt.driver [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Emitting event <LifecycleEvent: 1765015584.9083276, a6718b79-80e4-4d66-bff2-537e642a14f9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.909 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] VM Started (Lifecycle Event)#033[00m
Dec  6 05:06:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.929 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.935 229250 DEBUG nova.virt.driver [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Emitting event <LifecycleEvent: 1765015584.9085522, a6718b79-80e4-4d66-bff2-537e642a14f9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.935 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] VM Paused (Lifecycle Event)#033[00m
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.954 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.958 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  6 05:06:24 np0005548918 nova_compute[229246]: 2025-12-06 10:06:24.985 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  6 05:06:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.054 229250 DEBUG nova.compute.manager [req-dc2d9aa5-7a6e-4a65-821b-55b436399900 req-8926dc30-f221-4986-b15e-c45094e2eb6f d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Received event network-vif-plugged-9b30064b-3b78-4be4-bd9a-743cb550a78d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.055 229250 DEBUG oslo_concurrency.lockutils [req-dc2d9aa5-7a6e-4a65-821b-55b436399900 req-8926dc30-f221-4986-b15e-c45094e2eb6f d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.055 229250 DEBUG oslo_concurrency.lockutils [req-dc2d9aa5-7a6e-4a65-821b-55b436399900 req-8926dc30-f221-4986-b15e-c45094e2eb6f d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.055 229250 DEBUG oslo_concurrency.lockutils [req-dc2d9aa5-7a6e-4a65-821b-55b436399900 req-8926dc30-f221-4986-b15e-c45094e2eb6f d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.055 229250 DEBUG nova.compute.manager [req-dc2d9aa5-7a6e-4a65-821b-55b436399900 req-8926dc30-f221-4986-b15e-c45094e2eb6f d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Processing event network-vif-plugged-9b30064b-3b78-4be4-bd9a-743cb550a78d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.056 229250 DEBUG nova.compute.manager [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.060 229250 DEBUG nova.virt.driver [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Emitting event <LifecycleEvent: 1765015585.0600657, a6718b79-80e4-4d66-bff2-537e642a14f9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.060 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] VM Resumed (Lifecycle Event)#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.074 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.077 229250 INFO nova.virt.libvirt.driver [-] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Instance spawned successfully.#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.078 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.081 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.085 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.104 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.104 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.105 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.105 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.106 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.106 229250 DEBUG nova.virt.libvirt.driver [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.110 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  6 05:06:25 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:25.163 141640 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  6 05:06:25 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:25.164 141640 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmps4gsrbfx/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  6 05:06:25 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:25.013 233203 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  6 05:06:25 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:25.016 233203 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  6 05:06:25 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:25.018 233203 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Dec  6 05:06:25 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:25.019 233203 INFO oslo.privsep.daemon [-] privsep daemon running as pid 233203#033[00m
Dec  6 05:06:25 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:25.166 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[8bd7b053-d9f4-4cac-baee-a46b3a713a5c]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.169 229250 INFO nova.compute.manager [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Took 11.29 seconds to spawn the instance on the hypervisor.#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.170 229250 DEBUG nova.compute.manager [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.234 229250 INFO nova.compute.manager [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Took 12.22 seconds to build instance.#033[00m
Dec  6 05:06:25 np0005548918 nova_compute[229246]: 2025-12-06 10:06:25.254 229250 DEBUG oslo_concurrency.lockutils [None req-d7df5e8f-9729-4a7c-b49a-96440011a409 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:25 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:25 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:25.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:25 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:25.970 233203 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:25 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:25.971 233203 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:25 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:25.971 233203 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:26.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:26 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:26.713 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[646bc780-bba8-4ff0-bd99-28d64899b8ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:26 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:26.714 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb5acf2ad-b1 in ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  6 05:06:26 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:26.715 233203 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb5acf2ad-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  6 05:06:26 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:26.716 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[d72c59d9-ea49-4263-8782-2c4a6ce17ce4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:26 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:26.718 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[9d8c9294-8378-47f5-9275-1588b5cb1d0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:26 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:26.742 141754 DEBUG oslo.privsep.daemon [-] privsep: reply[51d263c5-4ea3-4b27-baf9-becb83580d27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:26 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:26 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:26.771 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[0a9bfdfc-c0da-4f32-9540-de771707da1d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:26 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:26.773 141640 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpcw_j7qb8/privsep.sock']#033[00m
Dec  6 05:06:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:27 np0005548918 nova_compute[229246]: 2025-12-06 10:06:27.153 229250 DEBUG nova.compute.manager [req-9dbce1e3-9f49-4df3-84d8-9bba00d654d6 req-183969fb-0b0f-4667-a8ca-bd1858653ab0 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Received event network-vif-plugged-9b30064b-3b78-4be4-bd9a-743cb550a78d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:06:27 np0005548918 nova_compute[229246]: 2025-12-06 10:06:27.154 229250 DEBUG oslo_concurrency.lockutils [req-9dbce1e3-9f49-4df3-84d8-9bba00d654d6 req-183969fb-0b0f-4667-a8ca-bd1858653ab0 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:27 np0005548918 nova_compute[229246]: 2025-12-06 10:06:27.155 229250 DEBUG oslo_concurrency.lockutils [req-9dbce1e3-9f49-4df3-84d8-9bba00d654d6 req-183969fb-0b0f-4667-a8ca-bd1858653ab0 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:27 np0005548918 nova_compute[229246]: 2025-12-06 10:06:27.155 229250 DEBUG oslo_concurrency.lockutils [req-9dbce1e3-9f49-4df3-84d8-9bba00d654d6 req-183969fb-0b0f-4667-a8ca-bd1858653ab0 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:27 np0005548918 nova_compute[229246]: 2025-12-06 10:06:27.155 229250 DEBUG nova.compute.manager [req-9dbce1e3-9f49-4df3-84d8-9bba00d654d6 req-183969fb-0b0f-4667-a8ca-bd1858653ab0 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] No waiting events found dispatching network-vif-plugged-9b30064b-3b78-4be4-bd9a-743cb550a78d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:06:27 np0005548918 nova_compute[229246]: 2025-12-06 10:06:27.156 229250 WARNING nova.compute.manager [req-9dbce1e3-9f49-4df3-84d8-9bba00d654d6 req-183969fb-0b0f-4667-a8ca-bd1858653ab0 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Received unexpected event network-vif-plugged-9b30064b-3b78-4be4-bd9a-743cb550a78d for instance with vm_state active and task_state None.#033[00m
Dec  6 05:06:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:27 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:27 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:27.465 141640 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  6 05:06:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:27.467 141640 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpcw_j7qb8/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  6 05:06:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:27.338 233220 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  6 05:06:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:27.342 233220 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  6 05:06:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:27.344 233220 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Dec  6 05:06:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:27.344 233220 INFO oslo.privsep.daemon [-] privsep daemon running as pid 233220#033[00m
Dec  6 05:06:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:27.469 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[9fac3855-1921-4883-8ddc-aeb767620cf3]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:27 np0005548918 nova_compute[229246]: 2025-12-06 10:06:27.651 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:27.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:27.972 233220 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:27.972 233220 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:27.972 233220 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:28.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.553 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[7ae47d3d-532c-494f-b6b7-f0c201fcf564]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:28 np0005548918 NetworkManager[48884]: <info>  [1765015588.5724] manager: (tapb5acf2ad-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.574 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[badf06d5-1449-4a81-a9a1-fd0549e0216b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:28 np0005548918 systemd-udevd[233233]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.605 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[64f08b98-f36b-4e49-9d26-ade307cd573e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.610 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[67bff164-eb22-4488-8361-523d6a4745cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:28 np0005548918 NetworkManager[48884]: <info>  [1765015588.6318] device (tapb5acf2ad-b0): carrier: link connected
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.637 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[785f70df-ebec-440c-b7ed-44a307a21352]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.653 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[9e73f3af-7d7a-4501-ae44-1eff39a202b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5acf2ad-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:54:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396223, 'reachable_time': 30793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233251, 'error': None, 'target': 'ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.668 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[e2929331-a081-4930-939d-e0972dc06503]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:54df'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396223, 'tstamp': 396223}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233252, 'error': None, 'target': 'ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.683 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d9bd9b-93f6-4ccb-86b6-1ee0d43533c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5acf2ad-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:54:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396223, 'reachable_time': 30793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233253, 'error': None, 'target': 'ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.710 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[5ea674a0-a544-452d-9d65-d4a800b3a4c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.758 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[73605997-b4b3-4d7e-a09c-1fe728b888ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.760 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5acf2ad-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.760 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.761 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5acf2ad-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:06:28 np0005548918 nova_compute[229246]: 2025-12-06 10:06:28.763 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:28 np0005548918 kernel: tapb5acf2ad-b0: entered promiscuous mode
Dec  6 05:06:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:28 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:28 np0005548918 NetworkManager[48884]: <info>  [1765015588.7637] manager: (tapb5acf2ad-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.768 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5acf2ad-b0, col_values=(('external_ids', {'iface-id': '7d6f5b43-9a57-499b-ab0f-37616bc6af78'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:06:28 np0005548918 ovn_controller[132371]: 2025-12-06T10:06:28Z|00037|binding|INFO|Releasing lport 7d6f5b43-9a57-499b-ab0f-37616bc6af78 from this chassis (sb_readonly=0)
Dec  6 05:06:28 np0005548918 nova_compute[229246]: 2025-12-06 10:06:28.770 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.771 141640 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5acf2ad-ba64-4833-ba3e-fc2228d0500f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5acf2ad-ba64-4833-ba3e-fc2228d0500f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.774 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[06285f85-70ce-45aa-a35d-8f8a76063b09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.775 141640 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: global
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    log         /dev/log local0 debug
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    log-tag     haproxy-metadata-proxy-b5acf2ad-ba64-4833-ba3e-fc2228d0500f
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    user        root
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    group       root
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    maxconn     1024
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    pidfile     /var/lib/neutron/external/pids/b5acf2ad-ba64-4833-ba3e-fc2228d0500f.pid.haproxy
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    daemon
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: defaults
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    log global
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    mode http
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    option httplog
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    option dontlognull
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    option http-server-close
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    option forwardfor
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    retries                 3
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    timeout http-request    30s
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    timeout connect         30s
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    timeout client          32s
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    timeout server          32s
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    timeout http-keep-alive 30s
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: listen listener
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    bind 169.254.169.254:80
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    server metadata /var/lib/neutron/metadata_proxy
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]:    http-request add-header X-OVN-Network-ID b5acf2ad-ba64-4833-ba3e-fc2228d0500f
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  6 05:06:28 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:28.776 141640 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f', 'env', 'PROCESS_TAG=haproxy-b5acf2ad-ba64-4833-ba3e-fc2228d0500f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b5acf2ad-ba64-4833-ba3e-fc2228d0500f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  6 05:06:28 np0005548918 nova_compute[229246]: 2025-12-06 10:06:28.782 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:28 np0005548918 nova_compute[229246]: 2025-12-06 10:06:28.955 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:29 np0005548918 podman[233287]: 2025-12-06 10:06:29.129892766 +0000 UTC m=+0.043556015 container create 681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:06:29 np0005548918 systemd[1]: Started libpod-conmon-681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27.scope.
Dec  6 05:06:29 np0005548918 systemd[1]: Started libcrun container.
Dec  6 05:06:29 np0005548918 podman[233287]: 2025-12-06 10:06:29.105549059 +0000 UTC m=+0.019212318 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
Dec  6 05:06:29 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1c30beafdd07a7f3f097cbb519d81eab224784b23e20b09c6d9ca6f03c9460d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  6 05:06:29 np0005548918 podman[233287]: 2025-12-06 10:06:29.213389544 +0000 UTC m=+0.127052823 container init 681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:06:29 np0005548918 podman[233287]: 2025-12-06 10:06:29.218566975 +0000 UTC m=+0.132230224 container start 681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:06:29 np0005548918 neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f[233303]: [NOTICE]   (233307) : New worker (233309) forked
Dec  6 05:06:29 np0005548918 neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f[233303]: [NOTICE]   (233307) : Loading success.
Dec  6 05:06:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:29 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:29 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:29.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:30.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:30 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:31 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:31 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:31.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:32.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:32 np0005548918 nova_compute[229246]: 2025-12-06 10:06:32.719 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:32 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa340000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:33 np0005548918 podman[233348]: 2025-12-06 10:06:33.19965396 +0000 UTC m=+0.077686110 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125)
Dec  6 05:06:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:33 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:33 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:33 np0005548918 NetworkManager[48884]: <info>  [1765015593.4887] manager: (patch-br-int-to-provnet-c81e973e-7ff9-4cd2-9994-daf87649321f): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/27)
Dec  6 05:06:33 np0005548918 NetworkManager[48884]: <info>  [1765015593.4892] device (patch-br-int-to-provnet-c81e973e-7ff9-4cd2-9994-daf87649321f)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 05:06:33 np0005548918 nova_compute[229246]: 2025-12-06 10:06:33.488 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:33 np0005548918 NetworkManager[48884]: <info>  [1765015593.4903] manager: (patch-provnet-c81e973e-7ff9-4cd2-9994-daf87649321f-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/28)
Dec  6 05:06:33 np0005548918 NetworkManager[48884]: <info>  [1765015593.4906] device (patch-provnet-c81e973e-7ff9-4cd2-9994-daf87649321f-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  6 05:06:33 np0005548918 NetworkManager[48884]: <info>  [1765015593.4915] manager: (patch-provnet-c81e973e-7ff9-4cd2-9994-daf87649321f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Dec  6 05:06:33 np0005548918 NetworkManager[48884]: <info>  [1765015593.4920] manager: (patch-br-int-to-provnet-c81e973e-7ff9-4cd2-9994-daf87649321f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Dec  6 05:06:33 np0005548918 NetworkManager[48884]: <info>  [1765015593.4928] device (patch-br-int-to-provnet-c81e973e-7ff9-4cd2-9994-daf87649321f)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  6 05:06:33 np0005548918 NetworkManager[48884]: <info>  [1765015593.4931] device (patch-provnet-c81e973e-7ff9-4cd2-9994-daf87649321f-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  6 05:06:33 np0005548918 nova_compute[229246]: 2025-12-06 10:06:33.563 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:33 np0005548918 ovn_controller[132371]: 2025-12-06T10:06:33Z|00038|binding|INFO|Releasing lport 7d6f5b43-9a57-499b-ab0f-37616bc6af78 from this chassis (sb_readonly=0)
Dec  6 05:06:33 np0005548918 nova_compute[229246]: 2025-12-06 10:06:33.570 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:33.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:33 np0005548918 nova_compute[229246]: 2025-12-06 10:06:33.958 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:34.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:34 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:35 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:35.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:36.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:36 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:37 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:37 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:37 np0005548918 nova_compute[229246]: 2025-12-06 10:06:37.721 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:37.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:38 np0005548918 ovn_controller[132371]: 2025-12-06T10:06:38Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:38:c4:70 10.100.0.27
Dec  6 05:06:38 np0005548918 ovn_controller[132371]: 2025-12-06T10:06:38Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:38:c4:70 10.100.0.27
Dec  6 05:06:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:38.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:38 np0005548918 nova_compute[229246]: 2025-12-06 10:06:38.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:06:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:38 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:38 np0005548918 nova_compute[229246]: 2025-12-06 10:06:38.960 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:39 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:39 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:39 np0005548918 nova_compute[229246]: 2025-12-06 10:06:39.530 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:06:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:39.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100640 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:06:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:40.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:40 np0005548918 nova_compute[229246]: 2025-12-06 10:06:40.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:06:40 np0005548918 nova_compute[229246]: 2025-12-06 10:06:40.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:06:40 np0005548918 nova_compute[229246]: 2025-12-06 10:06:40.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:06:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:40 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:41 np0005548918 podman[233385]: 2025-12-06 10:06:41.181062884 +0000 UTC m=+0.066319979 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:06:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:41 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:41 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:41 np0005548918 nova_compute[229246]: 2025-12-06 10:06:41.530 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:06:41 np0005548918 nova_compute[229246]: 2025-12-06 10:06:41.548 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:06:41 np0005548918 nova_compute[229246]: 2025-12-06 10:06:41.549 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:06:41 np0005548918 nova_compute[229246]: 2025-12-06 10:06:41.549 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:06:41 np0005548918 nova_compute[229246]: 2025-12-06 10:06:41.888 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "refresh_cache-a6718b79-80e4-4d66-bff2-537e642a14f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:06:41 np0005548918 nova_compute[229246]: 2025-12-06 10:06:41.888 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquired lock "refresh_cache-a6718b79-80e4-4d66-bff2-537e642a14f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:06:41 np0005548918 nova_compute[229246]: 2025-12-06 10:06:41.888 229250 DEBUG nova.network.neutron [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  6 05:06:41 np0005548918 nova_compute[229246]: 2025-12-06 10:06:41.889 229250 DEBUG nova.objects.instance [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a6718b79-80e4-4d66-bff2-537e642a14f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  6 05:06:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:41.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:42.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:42 np0005548918 nova_compute[229246]: 2025-12-06 10:06:42.724 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:42 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:43 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa340002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:43 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:43 np0005548918 nova_compute[229246]: 2025-12-06 10:06:43.539 229250 DEBUG nova.network.neutron [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Updating instance_info_cache with network_info: [{"id": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "address": "fa:16:3e:38:c4:70", "network": {"id": "b5acf2ad-ba64-4833-ba3e-fc2228d0500f", "bridge": "br-int", "label": "tempest-network-smoke--179060463", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b30064b-3b", "ovs_interfaceid": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:06:43 np0005548918 nova_compute[229246]: 2025-12-06 10:06:43.556 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Releasing lock "refresh_cache-a6718b79-80e4-4d66-bff2-537e642a14f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:06:43 np0005548918 nova_compute[229246]: 2025-12-06 10:06:43.556 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  6 05:06:43 np0005548918 nova_compute[229246]: 2025-12-06 10:06:43.557 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:06:43 np0005548918 nova_compute[229246]: 2025-12-06 10:06:43.557 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:06:43 np0005548918 nova_compute[229246]: 2025-12-06 10:06:43.557 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:06:43 np0005548918 nova_compute[229246]: 2025-12-06 10:06:43.557 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:06:43 np0005548918 nova_compute[229246]: 2025-12-06 10:06:43.575 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:43 np0005548918 nova_compute[229246]: 2025-12-06 10:06:43.576 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:43 np0005548918 nova_compute[229246]: 2025-12-06 10:06:43.576 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:43 np0005548918 nova_compute[229246]: 2025-12-06 10:06:43.576 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:06:43 np0005548918 nova_compute[229246]: 2025-12-06 10:06:43.577 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:06:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:43 np0005548918 nova_compute[229246]: 2025-12-06 10:06:43.962 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:43.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:06:44 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3034468691' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.020 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.094 229250 DEBUG nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.095 229250 DEBUG nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec  6 05:06:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.299 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.301 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4728MB free_disk=59.897621154785156GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.301 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.302 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.375 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Instance a6718b79-80e4-4d66-bff2-537e642a14f9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.375 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.375 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.461 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:06:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:44.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:44 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:06:44 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/490054839' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:06:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.930 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.937 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating inventory in ProviderTree for provider 31f5f484-bf36-44de-83b8-7b434061a77b with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.982 229250 ERROR nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] [req-d8173051-9300-4c05-9552-ae3246bccd45] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 31f5f484-bf36-44de-83b8-7b434061a77b.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-d8173051-9300-4c05-9552-ae3246bccd45"}]}#033[00m
Dec  6 05:06:44 np0005548918 nova_compute[229246]: 2025-12-06 10:06:44.997 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing inventories for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  6 05:06:45 np0005548918 nova_compute[229246]: 2025-12-06 10:06:45.012 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating ProviderTree inventory for provider 31f5f484-bf36-44de-83b8-7b434061a77b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  6 05:06:45 np0005548918 nova_compute[229246]: 2025-12-06 10:06:45.012 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating inventory in ProviderTree for provider 31f5f484-bf36-44de-83b8-7b434061a77b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  6 05:06:45 np0005548918 nova_compute[229246]: 2025-12-06 10:06:45.027 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing aggregate associations for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  6 05:06:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:45 np0005548918 nova_compute[229246]: 2025-12-06 10:06:45.075 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing trait associations for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE4A,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_ABM,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,HW_CPU_X86_AVX2,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  6 05:06:45 np0005548918 nova_compute[229246]: 2025-12-06 10:06:45.129 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:06:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:45 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:45 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa340002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:06:45 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2091989865' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:06:45 np0005548918 nova_compute[229246]: 2025-12-06 10:06:45.903 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.774s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:06:45 np0005548918 nova_compute[229246]: 2025-12-06 10:06:45.912 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating inventory in ProviderTree for provider 31f5f484-bf36-44de-83b8-7b434061a77b with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  6 05:06:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:45.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:45 np0005548918 nova_compute[229246]: 2025-12-06 10:06:45.973 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updated inventory for provider 31f5f484-bf36-44de-83b8-7b434061a77b with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Dec  6 05:06:45 np0005548918 nova_compute[229246]: 2025-12-06 10:06:45.973 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating resource provider 31f5f484-bf36-44de-83b8-7b434061a77b generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec  6 05:06:45 np0005548918 nova_compute[229246]: 2025-12-06 10:06:45.974 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating inventory in ProviderTree for provider 31f5f484-bf36-44de-83b8-7b434061a77b with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  6 05:06:46 np0005548918 nova_compute[229246]: 2025-12-06 10:06:46.006 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:06:46 np0005548918 nova_compute[229246]: 2025-12-06 10:06:46.007 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:46 np0005548918 nova_compute[229246]: 2025-12-06 10:06:46.101 229250 DEBUG oslo_concurrency.lockutils [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "a6718b79-80e4-4d66-bff2-537e642a14f9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:46 np0005548918 nova_compute[229246]: 2025-12-06 10:06:46.101 229250 DEBUG oslo_concurrency.lockutils [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:46 np0005548918 nova_compute[229246]: 2025-12-06 10:06:46.102 229250 DEBUG oslo_concurrency.lockutils [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:46 np0005548918 nova_compute[229246]: 2025-12-06 10:06:46.102 229250 DEBUG oslo_concurrency.lockutils [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:46 np0005548918 nova_compute[229246]: 2025-12-06 10:06:46.103 229250 DEBUG oslo_concurrency.lockutils [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:46 np0005548918 nova_compute[229246]: 2025-12-06 10:06:46.104 229250 INFO nova.compute.manager [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Terminating instance#033[00m
Dec  6 05:06:46 np0005548918 nova_compute[229246]: 2025-12-06 10:06:46.105 229250 DEBUG nova.compute.manager [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  6 05:06:46 np0005548918 podman[233477]: 2025-12-06 10:06:46.201088203 +0000 UTC m=+0.077964587 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  6 05:06:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:46.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:46 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa3580042a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:47 np0005548918 kernel: tap9b30064b-3b (unregistering): left promiscuous mode
Dec  6 05:06:47 np0005548918 NetworkManager[48884]: <info>  [1765015607.3974] device (tap9b30064b-3b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  6 05:06:47 np0005548918 ovn_controller[132371]: 2025-12-06T10:06:47Z|00039|binding|INFO|Releasing lport 9b30064b-3b78-4be4-bd9a-743cb550a78d from this chassis (sb_readonly=0)
Dec  6 05:06:47 np0005548918 ovn_controller[132371]: 2025-12-06T10:06:47Z|00040|binding|INFO|Setting lport 9b30064b-3b78-4be4-bd9a-743cb550a78d down in Southbound
Dec  6 05:06:47 np0005548918 ovn_controller[132371]: 2025-12-06T10:06:47Z|00041|binding|INFO|Removing iface tap9b30064b-3b ovn-installed in OVS
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.404 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.405 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:47 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa34c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.422 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:47 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa33c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:06:47 np0005548918 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec  6 05:06:47 np0005548918 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 13.478s CPU time.
Dec  6 05:06:47 np0005548918 systemd-machined[192688]: Machine qemu-1-instance-00000002 terminated.
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.553 229250 INFO nova.virt.libvirt.driver [-] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Instance destroyed successfully.#033[00m
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.553 229250 DEBUG nova.objects.instance [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lazy-loading 'resources' on Instance uuid a6718b79-80e4-4d66-bff2-537e642a14f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.763 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:47 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:47.775 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:c4:70 10.100.0.27'], port_security=['fa:16:3e:38:c4:70 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': 'a6718b79-80e4-4d66-bff2-537e642a14f9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5acf2ad-ba64-4833-ba3e-fc2228d0500f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8306047b-b173-44bd-a997-036bd27c9520', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6dc6f73b-6456-426e-b8ba-888fa95f6d1c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], logical_port=9b30064b-3b78-4be4-bd9a-743cb550a78d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:06:47 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:47.778 141640 INFO neutron.agent.ovn.metadata.agent [-] Port 9b30064b-3b78-4be4-bd9a-743cb550a78d in datapath b5acf2ad-ba64-4833-ba3e-fc2228d0500f unbound from our chassis#033[00m
Dec  6 05:06:47 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:47.780 141640 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5acf2ad-ba64-4833-ba3e-fc2228d0500f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  6 05:06:47 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:47.781 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[266f0f9c-236e-4d54-9ad8-e65301596a41]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:47 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:47.782 141640 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f namespace which is not needed anymore#033[00m
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.788 229250 DEBUG nova.virt.libvirt.vif [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-06T10:06:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-514803950',display_name='tempest-TestNetworkBasicOps-server-514803950',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-514803950',id=2,image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBANfrbb3vVLiiIIaJGU6ReqTxhWXK99uQbAyPO3sQfSe2idiKGUdXmXt18P0yE+abn4bDTp3JoNmgFWotoifbNcqtioAi5Sf2d/ivSV7j6kZBvD8Bee93HLbcefk+780Cw==',key_name='tempest-TestNetworkBasicOps-1252291136',keypairs=<?>,launch_index=0,launched_at=2025-12-06T10:06:25Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='92b402c8d3e2476abc98be42a1e6d34e',ramdisk_id='',reservation_id='r-6sph5n4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1971100882',owner_user_name='tempest-TestNetworkBasicOps-1971100882-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T10:06:25Z,user_data=None,user_id='03615580775245e6ae335ee9d785611f',uuid=a6718b79-80e4-4d66-bff2-537e642a14f9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "address": "fa:16:3e:38:c4:70", "network": {"id": "b5acf2ad-ba64-4833-ba3e-fc2228d0500f", "bridge": "br-int", "label": "tempest-network-smoke--179060463", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b30064b-3b", "ovs_interfaceid": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.789 229250 DEBUG nova.network.os_vif_util [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converting VIF {"id": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "address": "fa:16:3e:38:c4:70", "network": {"id": "b5acf2ad-ba64-4833-ba3e-fc2228d0500f", "bridge": "br-int", "label": "tempest-network-smoke--179060463", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b30064b-3b", "ovs_interfaceid": "9b30064b-3b78-4be4-bd9a-743cb550a78d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.790 229250 DEBUG nova.network.os_vif_util [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:38:c4:70,bridge_name='br-int',has_traffic_filtering=True,id=9b30064b-3b78-4be4-bd9a-743cb550a78d,network=Network(b5acf2ad-ba64-4833-ba3e-fc2228d0500f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b30064b-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.790 229250 DEBUG os_vif [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:c4:70,bridge_name='br-int',has_traffic_filtering=True,id=9b30064b-3b78-4be4-bd9a-743cb550a78d,network=Network(b5acf2ad-ba64-4833-ba3e-fc2228d0500f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b30064b-3b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.792 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.793 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b30064b-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.794 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.796 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:47 np0005548918 nova_compute[229246]: 2025-12-06 10:06:47.799 229250 INFO os_vif [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:c4:70,bridge_name='br-int',has_traffic_filtering=True,id=9b30064b-3b78-4be4-bd9a-743cb550a78d,network=Network(b5acf2ad-ba64-4833-ba3e-fc2228d0500f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b30064b-3b')#033[00m
Dec  6 05:06:47 np0005548918 neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f[233303]: [NOTICE]   (233307) : haproxy version is 2.8.14-c23fe91
Dec  6 05:06:47 np0005548918 neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f[233303]: [NOTICE]   (233307) : path to executable is /usr/sbin/haproxy
Dec  6 05:06:47 np0005548918 neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f[233303]: [WARNING]  (233307) : Exiting Master process...
Dec  6 05:06:47 np0005548918 neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f[233303]: [ALERT]    (233307) : Current worker (233309) exited with code 143 (Terminated)
Dec  6 05:06:47 np0005548918 neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f[233303]: [WARNING]  (233307) : All workers exited. Exiting... (0)
Dec  6 05:06:47 np0005548918 systemd[1]: libpod-681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27.scope: Deactivated successfully.
Dec  6 05:06:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:47 np0005548918 podman[233551]: 2025-12-06 10:06:47.928114514 +0000 UTC m=+0.047957155 container died 681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 05:06:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:47.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:47 np0005548918 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27-userdata-shm.mount: Deactivated successfully.
Dec  6 05:06:47 np0005548918 systemd[1]: var-lib-containers-storage-overlay-a1c30beafdd07a7f3f097cbb519d81eab224784b23e20b09c6d9ca6f03c9460d-merged.mount: Deactivated successfully.
Dec  6 05:06:47 np0005548918 podman[233551]: 2025-12-06 10:06:47.999635774 +0000 UTC m=+0.119478415 container cleanup 681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  6 05:06:48 np0005548918 systemd[1]: libpod-conmon-681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27.scope: Deactivated successfully.
Dec  6 05:06:48 np0005548918 podman[233582]: 2025-12-06 10:06:48.05606013 +0000 UTC m=+0.036422749 container remove 681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec  6 05:06:48 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:48.063 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[a51862fd-a869-4dec-a429-fd4f7158a3ab]: (4, ('Sat Dec  6 10:06:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f (681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27)\n681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27\nSat Dec  6 10:06:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f (681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27)\n681ae0b8a1f4a9e45481133051bc5666b41742733e40322a30e20c1400cf1a27\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:48 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:48.065 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[19ff1a1b-f9d1-48aa-8d73-f920c567f13f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:48 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:48.065 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5acf2ad-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.067 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:48 np0005548918 kernel: tapb5acf2ad-b0: left promiscuous mode
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.081 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:48 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:48.083 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[4ce756cc-18f6-4fd7-bac9-0fc7001eeba4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.089 229250 DEBUG nova.compute.manager [req-f7780de6-13e5-48fa-a517-9ff03daeee07 req-c97752c2-df91-41a8-807d-bfba960b7519 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Received event network-vif-unplugged-9b30064b-3b78-4be4-bd9a-743cb550a78d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.090 229250 DEBUG oslo_concurrency.lockutils [req-f7780de6-13e5-48fa-a517-9ff03daeee07 req-c97752c2-df91-41a8-807d-bfba960b7519 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.090 229250 DEBUG oslo_concurrency.lockutils [req-f7780de6-13e5-48fa-a517-9ff03daeee07 req-c97752c2-df91-41a8-807d-bfba960b7519 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.090 229250 DEBUG oslo_concurrency.lockutils [req-f7780de6-13e5-48fa-a517-9ff03daeee07 req-c97752c2-df91-41a8-807d-bfba960b7519 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.090 229250 DEBUG nova.compute.manager [req-f7780de6-13e5-48fa-a517-9ff03daeee07 req-c97752c2-df91-41a8-807d-bfba960b7519 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] No waiting events found dispatching network-vif-unplugged-9b30064b-3b78-4be4-bd9a-743cb550a78d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.091 229250 DEBUG nova.compute.manager [req-f7780de6-13e5-48fa-a517-9ff03daeee07 req-c97752c2-df91-41a8-807d-bfba960b7519 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Received event network-vif-unplugged-9b30064b-3b78-4be4-bd9a-743cb550a78d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  6 05:06:48 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:48.103 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[f402ed74-3323-438b-bbab-74d8cc39948e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:48 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:48.104 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[3d278a28-4e64-4f40-882a-b891d2bedbb4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:48 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:48.117 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[8f84a1d5-84de-4731-b61c-3a03bbd87531]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396215, 'reachable_time': 36226, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233598, 'error': None, 'target': 'ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:48 np0005548918 systemd[1]: run-netns-ovnmeta\x2db5acf2ad\x2dba64\x2d4833\x2dba3e\x2dfc2228d0500f.mount: Deactivated successfully.
Dec  6 05:06:48 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:48.131 141754 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b5acf2ad-ba64-4833-ba3e-fc2228d0500f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  6 05:06:48 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:48.132 141754 DEBUG oslo.privsep.daemon [-] privsep: reply[8da0dcd4-9d0a-4254-8ad4-1c1be8fa6dac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.210 229250 INFO nova.virt.libvirt.driver [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Deleting instance files /var/lib/nova/instances/a6718b79-80e4-4d66-bff2-537e642a14f9_del#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.211 229250 INFO nova.virt.libvirt.driver [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Deletion of /var/lib/nova/instances/a6718b79-80e4-4d66-bff2-537e642a14f9_del complete#033[00m
Dec  6 05:06:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.286 229250 DEBUG nova.virt.libvirt.host [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.287 229250 INFO nova.virt.libvirt.host [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] UEFI support detected#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.289 229250 INFO nova.compute.manager [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Took 2.18 seconds to destroy the instance on the hypervisor.#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.289 229250 DEBUG oslo.service.loopingcall [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.289 229250 DEBUG nova.compute.manager [-] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.290 229250 DEBUG nova.network.neutron [-] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  6 05:06:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:06:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:48.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:06:48 np0005548918 kernel: ganesha.nfsd[233320]: segfault at 50 ip 00007fa42287132e sp 00007fa3f1ffa210 error 4 in libntirpc.so.5.8[7fa422856000+2c000] likely on CPU 5 (core 0, socket 5)
Dec  6 05:06:48 np0005548918 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 05:06:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[232296]: 06/12/2025 10:06:48 : epoch 6933ffe1 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa340002b10 fd 39 proxy ignored for local
Dec  6 05:06:48 np0005548918 systemd[1]: Started Process Core Dump (PID 233600/UID 0).
Dec  6 05:06:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.964 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.967 229250 DEBUG nova.network.neutron [-] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:06:48 np0005548918 nova_compute[229246]: 2025-12-06 10:06:48.981 229250 INFO nova.compute.manager [-] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Took 0.69 seconds to deallocate network for instance.#033[00m
Dec  6 05:06:49 np0005548918 nova_compute[229246]: 2025-12-06 10:06:49.029 229250 DEBUG oslo_concurrency.lockutils [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:49 np0005548918 nova_compute[229246]: 2025-12-06 10:06:49.030 229250 DEBUG oslo_concurrency.lockutils [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:49 np0005548918 nova_compute[229246]: 2025-12-06 10:06:49.073 229250 DEBUG oslo_concurrency.processutils [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:06:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:49 np0005548918 nova_compute[229246]: 2025-12-06 10:06:49.516 229250 DEBUG oslo_concurrency.processutils [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:06:49 np0005548918 nova_compute[229246]: 2025-12-06 10:06:49.521 229250 DEBUG nova.compute.provider_tree [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:06:49 np0005548918 nova_compute[229246]: 2025-12-06 10:06:49.540 229250 DEBUG nova.scheduler.client.report [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:06:49 np0005548918 nova_compute[229246]: 2025-12-06 10:06:49.562 229250 DEBUG oslo_concurrency.lockutils [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:49 np0005548918 nova_compute[229246]: 2025-12-06 10:06:49.598 229250 INFO nova.scheduler.client.report [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Deleted allocations for instance a6718b79-80e4-4d66-bff2-537e642a14f9#033[00m
Dec  6 05:06:49 np0005548918 nova_compute[229246]: 2025-12-06 10:06:49.675 229250 DEBUG oslo_concurrency.lockutils [None req-1b3b158b-5be5-47e2-ab69-68c5e8b96039 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:49 np0005548918 systemd-coredump[233601]: Process 232300 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 59:#012#0  0x00007fa42287132e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 05:06:49 np0005548918 systemd[1]: systemd-coredump@6-233600-0.service: Deactivated successfully.
Dec  6 05:06:49 np0005548918 systemd[1]: systemd-coredump@6-233600-0.service: Consumed 1.012s CPU time.
Dec  6 05:06:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:49 np0005548918 podman[233630]: 2025-12-06 10:06:49.94252205 +0000 UTC m=+0.034585149 container died 54ee4eab8c62c38b4bde619ddd33181c6187b2f7e5025eaebc667b3481ee5955 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Dec  6 05:06:49 np0005548918 systemd[1]: var-lib-containers-storage-overlay-ccc7b516cfb488e4a02550397d0293a2c7677aa88723e02dc64e3567916ebb90-merged.mount: Deactivated successfully.
Dec  6 05:06:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:49.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:49 np0005548918 podman[233630]: 2025-12-06 10:06:49.991655166 +0000 UTC m=+0.083718195 container remove 54ee4eab8c62c38b4bde619ddd33181c6187b2f7e5025eaebc667b3481ee5955 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Dec  6 05:06:50 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Main process exited, code=exited, status=139/n/a
Dec  6 05:06:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:50 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Failed with result 'exit-code'.
Dec  6 05:06:50 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.376s CPU time.
Dec  6 05:06:50 np0005548918 nova_compute[229246]: 2025-12-06 10:06:50.176 229250 DEBUG nova.compute.manager [req-5e243090-2473-4a6c-b57c-0eed6fa2dcc0 req-d6a587c5-14c2-4fa3-bd68-229764174c6b d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Received event network-vif-plugged-9b30064b-3b78-4be4-bd9a-743cb550a78d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:06:50 np0005548918 nova_compute[229246]: 2025-12-06 10:06:50.177 229250 DEBUG oslo_concurrency.lockutils [req-5e243090-2473-4a6c-b57c-0eed6fa2dcc0 req-d6a587c5-14c2-4fa3-bd68-229764174c6b d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:50 np0005548918 nova_compute[229246]: 2025-12-06 10:06:50.177 229250 DEBUG oslo_concurrency.lockutils [req-5e243090-2473-4a6c-b57c-0eed6fa2dcc0 req-d6a587c5-14c2-4fa3-bd68-229764174c6b d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:50 np0005548918 nova_compute[229246]: 2025-12-06 10:06:50.178 229250 DEBUG oslo_concurrency.lockutils [req-5e243090-2473-4a6c-b57c-0eed6fa2dcc0 req-d6a587c5-14c2-4fa3-bd68-229764174c6b d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "a6718b79-80e4-4d66-bff2-537e642a14f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:50 np0005548918 nova_compute[229246]: 2025-12-06 10:06:50.178 229250 DEBUG nova.compute.manager [req-5e243090-2473-4a6c-b57c-0eed6fa2dcc0 req-d6a587c5-14c2-4fa3-bd68-229764174c6b d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] No waiting events found dispatching network-vif-plugged-9b30064b-3b78-4be4-bd9a-743cb550a78d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:06:50 np0005548918 nova_compute[229246]: 2025-12-06 10:06:50.179 229250 WARNING nova.compute.manager [req-5e243090-2473-4a6c-b57c-0eed6fa2dcc0 req-d6a587c5-14c2-4fa3-bd68-229764174c6b d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Received unexpected event network-vif-plugged-9b30064b-3b78-4be4-bd9a-743cb550a78d for instance with vm_state deleted and task_state None.#033[00m
Dec  6 05:06:50 np0005548918 nova_compute[229246]: 2025-12-06 10:06:50.179 229250 DEBUG nova.compute.manager [req-5e243090-2473-4a6c-b57c-0eed6fa2dcc0 req-d6a587c5-14c2-4fa3-bd68-229764174c6b d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Received event network-vif-deleted-9b30064b-3b78-4be4-bd9a-743cb550a78d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:06:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:50.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:51.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:52 np0005548918 nova_compute[229246]: 2025-12-06 10:06:52.472 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:52.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:52 np0005548918 nova_compute[229246]: 2025-12-06 10:06:52.578 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:52 np0005548918 nova_compute[229246]: 2025-12-06 10:06:52.795 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:53.672 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:06:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:53.673 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:06:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:06:53.673 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:06:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:53 np0005548918 nova_compute[229246]: 2025-12-06 10:06:53.967 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:53.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:54.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:06:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100655 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:06:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:55.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  6 05:06:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:56.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  6 05:06:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:57 np0005548918 nova_compute[229246]: 2025-12-06 10:06:57.798 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:57.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:58.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:06:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:58 np0005548918 nova_compute[229246]: 2025-12-06 10:06:58.969 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:06:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:06:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:06:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:06:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:06:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:06:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:59.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:00 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Scheduled restart job, restart counter is at 7.
Dec  6 05:07:00 np0005548918 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:07:00 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.376s CPU time.
Dec  6 05:07:00 np0005548918 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 05:07:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:00.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:00 np0005548918 podman[233754]: 2025-12-06 10:07:00.56465992 +0000 UTC m=+0.044729016 container create ac64860b815acf5b7beef1497de59798554a71947748d7def201dca58b1b4a6a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec  6 05:07:00 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4289d5edb023c7dbe91d5e418954f3f6566f52155e4cf4fe497941c8d806b00/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 05:07:00 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4289d5edb023c7dbe91d5e418954f3f6566f52155e4cf4fe497941c8d806b00/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 05:07:00 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4289d5edb023c7dbe91d5e418954f3f6566f52155e4cf4fe497941c8d806b00/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:07:00 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4289d5edb023c7dbe91d5e418954f3f6566f52155e4cf4fe497941c8d806b00/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.sseuqb-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:07:00 np0005548918 podman[233754]: 2025-12-06 10:07:00.62777529 +0000 UTC m=+0.107844406 container init ac64860b815acf5b7beef1497de59798554a71947748d7def201dca58b1b4a6a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 05:07:00 np0005548918 podman[233754]: 2025-12-06 10:07:00.632638533 +0000 UTC m=+0.112707629 container start ac64860b815acf5b7beef1497de59798554a71947748d7def201dca58b1b4a6a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 05:07:00 np0005548918 bash[233754]: ac64860b815acf5b7beef1497de59798554a71947748d7def201dca58b1b4a6a
Dec  6 05:07:00 np0005548918 podman[233754]: 2025-12-06 10:07:00.545380552 +0000 UTC m=+0.025449698 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 05:07:00 np0005548918 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:07:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:00 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 05:07:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:00 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 05:07:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:00 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 05:07:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:00 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 05:07:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:00 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 05:07:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:00 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 05:07:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:00 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 05:07:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:00 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:07:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:07:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:07:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:07:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:07:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:01.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:07:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:02.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:07:02 np0005548918 nova_compute[229246]: 2025-12-06 10:07:02.551 229250 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765015607.550105, a6718b79-80e4-4d66-bff2-537e642a14f9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:07:02 np0005548918 nova_compute[229246]: 2025-12-06 10:07:02.551 229250 INFO nova.compute.manager [-] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] VM Stopped (Lifecycle Event)#033[00m
Dec  6 05:07:02 np0005548918 nova_compute[229246]: 2025-12-06 10:07:02.623 229250 DEBUG nova.compute.manager [None req-a74b917e-33ea-41de-91e9-6db01e4f8c70 - - - - - -] [instance: a6718b79-80e4-4d66-bff2-537e642a14f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:07:02 np0005548918 nova_compute[229246]: 2025-12-06 10:07:02.799 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:03 np0005548918 nova_compute[229246]: 2025-12-06 10:07:03.972 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:03.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:04 np0005548918 podman[233896]: 2025-12-06 10:07:04.199137106 +0000 UTC m=+0.082934063 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller)
Dec  6 05:07:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:04.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:05.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:06.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:06 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  6 05:07:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:06 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  6 05:07:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:06 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:07:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:06 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:07:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:06 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:07:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:06 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:07:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:06 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:07:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:06 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:07:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:07:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:07:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:07 np0005548918 nova_compute[229246]: 2025-12-06 10:07:07.802 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:07.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:08.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:08 np0005548918 nova_compute[229246]: 2025-12-06 10:07:08.974 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:09.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100710 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:07:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:10.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:11 np0005548918 podman[233979]: 2025-12-06 10:07:11.865662161 +0000 UTC m=+0.063101409 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  6 05:07:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:11.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:12.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:12 np0005548918 nova_compute[229246]: 2025-12-06 10:07:12.803 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-000000000000001b:nfs.cephfs.1: -2
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:07:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:12 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:07:12.950 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:07:12 np0005548918 nova_compute[229246]: 2025-12-06 10:07:12.951 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:12 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:07:12.951 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:07:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:13 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:13 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:13 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:07:13.954 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:07:13 np0005548918 nova_compute[229246]: 2025-12-06 10:07:13.976 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:07:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:14.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:07:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:14.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:14 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:15 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100715 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:07:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:15 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc36c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:16.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:16.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:16 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:17 np0005548918 podman[234019]: 2025-12-06 10:07:17.192955951 +0000 UTC m=+0.075112379 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2)
Dec  6 05:07:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:17 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:17 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3600016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:17 np0005548918 nova_compute[229246]: 2025-12-06 10:07:17.806 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:18.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:07:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:18.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:07:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:18 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:18 np0005548918 nova_compute[229246]: 2025-12-06 10:07:18.979 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:19 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:19 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc36c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:20.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:20.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:20 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3600016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:21 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:21 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800016c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:22.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:22.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:22 np0005548918 nova_compute[229246]: 2025-12-06 10:07:22.810 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:22 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:23 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800016c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:23 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:23.920682) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643920719, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1207, "num_deletes": 251, "total_data_size": 2821937, "memory_usage": 2847312, "flush_reason": "Manual Compaction"}
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Dec  6 05:07:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643937510, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1820143, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24870, "largest_seqno": 26072, "table_properties": {"data_size": 1815003, "index_size": 2600, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11653, "raw_average_key_size": 19, "raw_value_size": 1804422, "raw_average_value_size": 3084, "num_data_blocks": 116, "num_entries": 585, "num_filter_entries": 585, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015553, "oldest_key_time": 1765015553, "file_creation_time": 1765015643, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 16882 microseconds, and 4403 cpu microseconds.
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:23.937561) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1820143 bytes OK
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:23.937580) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:23.940363) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:23.940378) EVENT_LOG_v1 {"time_micros": 1765015643940374, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:23.940397) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2816081, prev total WAL file size 2816081, number of live WAL files 2.
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:23.941545) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1777KB)], [48(13MB)]
Dec  6 05:07:23 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643941650, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 15534694, "oldest_snapshot_seqno": -1}
Dec  6 05:07:23 np0005548918 nova_compute[229246]: 2025-12-06 10:07:23.980 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:07:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:24.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5432 keys, 13333954 bytes, temperature: kUnknown
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015644112589, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 13333954, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13297693, "index_size": 21559, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13637, "raw_key_size": 139424, "raw_average_key_size": 25, "raw_value_size": 13199285, "raw_average_value_size": 2429, "num_data_blocks": 875, "num_entries": 5432, "num_filter_entries": 5432, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765015643, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:24.112943) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 13333954 bytes
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:24.114532) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 90.8 rd, 78.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.1 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(15.9) write-amplify(7.3) OK, records in: 5949, records dropped: 517 output_compression: NoCompression
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:24.114566) EVENT_LOG_v1 {"time_micros": 1765015644114551, "job": 28, "event": "compaction_finished", "compaction_time_micros": 171033, "compaction_time_cpu_micros": 50067, "output_level": 6, "num_output_files": 1, "total_output_size": 13333954, "num_input_records": 5949, "num_output_records": 5432, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015644115527, "job": 28, "event": "table_file_deletion", "file_number": 50}
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015644120925, "job": 28, "event": "table_file_deletion", "file_number": 48}
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:23.941416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:24.121270) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:24.121291) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:24.121295) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:24.121298) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:07:24 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:07:24.121301) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:07:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:07:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:24.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:07:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:24 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:25 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800016c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:25 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:26.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:26.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:26 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:27 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:27 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc36c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:27 np0005548918 nova_compute[229246]: 2025-12-06 10:07:27.813 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:28.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:28.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:28 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:28 np0005548918 nova_compute[229246]: 2025-12-06 10:07:28.982 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:29 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:29 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3680036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:30.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:30.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:30 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc36c002580 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:31 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:31 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:32.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:07:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:32.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:07:32 np0005548918 nova_compute[229246]: 2025-12-06 10:07:32.816 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:32 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3680036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:33 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc36c003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:33 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:33 np0005548918 nova_compute[229246]: 2025-12-06 10:07:33.984 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:34.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:34.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:34 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:35 np0005548918 podman[234083]: 2025-12-06 10:07:35.302296253 +0000 UTC m=+0.177317509 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  6 05:07:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:35 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3680036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:35 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc36c003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:36.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:36.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:36 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:37 np0005548918 ovn_controller[132371]: 2025-12-06T10:07:37Z|00042|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec  6 05:07:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:37 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:37 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:37 np0005548918 nova_compute[229246]: 2025-12-06 10:07:37.818 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:38.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:38.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:38 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc36c003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:38 np0005548918 nova_compute[229246]: 2025-12-06 10:07:38.986 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:39 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:39 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:40.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:40.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:40 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc36c003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:40 np0005548918 nova_compute[229246]: 2025-12-06 10:07:40.985 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:41 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:41 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:41 np0005548918 nova_compute[229246]: 2025-12-06 10:07:41.532 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:41 np0005548918 nova_compute[229246]: 2025-12-06 10:07:41.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:41 np0005548918 nova_compute[229246]: 2025-12-06 10:07:41.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:41 np0005548918 nova_compute[229246]: 2025-12-06 10:07:41.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:07:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:42.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:42 np0005548918 podman[234118]: 2025-12-06 10:07:42.196003346 +0000 UTC m=+0.070046994 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  6 05:07:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:42 np0005548918 nova_compute[229246]: 2025-12-06 10:07:42.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:42 np0005548918 nova_compute[229246]: 2025-12-06 10:07:42.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:07:42 np0005548918 nova_compute[229246]: 2025-12-06 10:07:42.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:07:42 np0005548918 nova_compute[229246]: 2025-12-06 10:07:42.560 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:07:42 np0005548918 nova_compute[229246]: 2025-12-06 10:07:42.561 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:42 np0005548918 nova_compute[229246]: 2025-12-06 10:07:42.561 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:42.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:42 np0005548918 nova_compute[229246]: 2025-12-06 10:07:42.821 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:42 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:43 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:43 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:43 np0005548918 nova_compute[229246]: 2025-12-06 10:07:43.534 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:43 np0005548918 nova_compute[229246]: 2025-12-06 10:07:43.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:07:43 np0005548918 nova_compute[229246]: 2025-12-06 10:07:43.573 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:07:43 np0005548918 nova_compute[229246]: 2025-12-06 10:07:43.574 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:07:43 np0005548918 nova_compute[229246]: 2025-12-06 10:07:43.574 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:07:43 np0005548918 nova_compute[229246]: 2025-12-06 10:07:43.574 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:07:43 np0005548918 nova_compute[229246]: 2025-12-06 10:07:43.574 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:07:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:07:44 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/399366570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.036 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:44.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.048 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.193 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.195 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4952MB free_disk=59.94289016723633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.195 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.195 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.245 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.245 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.269 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:07:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:44.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:07:44 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3598102245' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.705 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.713 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.728 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.762 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:07:44 np0005548918 nova_compute[229246]: 2025-12-06 10:07:44.763 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:07:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:44 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:45 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:45 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:46.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:46.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:46 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:47 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:47 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:47 np0005548918 nova_compute[229246]: 2025-12-06 10:07:47.823 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:48 np0005548918 podman[234190]: 2025-12-06 10:07:48.179146545 +0000 UTC m=+0.069987253 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  6 05:07:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:48.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:48.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:48 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:49 np0005548918 nova_compute[229246]: 2025-12-06 10:07:49.037 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:49 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:49 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:50.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:50.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:50 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:51 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003e60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:51 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:52.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:52.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:52 np0005548918 nova_compute[229246]: 2025-12-06 10:07:52.826 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:52 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003e60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:53 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:53 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:07:53.673 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:07:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:07:53.674 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:07:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:07:53.674 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:07:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:54 np0005548918 nova_compute[229246]: 2025-12-06 10:07:54.041 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:54.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:07:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:54.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:07:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:54 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003e60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:07:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:55 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:55 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:56.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:56.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:56 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:57 np0005548918 nova_compute[229246]: 2025-12-06 10:07:57.829 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:07:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:58.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:07:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:07:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:07:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:58.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:07:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:58 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c002bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:59 np0005548918 nova_compute[229246]: 2025-12-06 10:07:59.041 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:07:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:07:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:07:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:59 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:07:59 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:07:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:07:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:00.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:00.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:00 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:01 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:01 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c002bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:02.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:02.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:02 np0005548918 nova_compute[229246]: 2025-12-06 10:08:02.831 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:02 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c002bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:03 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:03 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c002bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:04 np0005548918 nova_compute[229246]: 2025-12-06 10:08:04.043 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:04.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:04.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:04 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:05 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:05 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:06 np0005548918 podman[234254]: 2025-12-06 10:08:06.229921409 +0000 UTC m=+0.117305348 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:08:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:06.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:06.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:06 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c002bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:07 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:07 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:07 np0005548918 nova_compute[229246]: 2025-12-06 10:08:07.833 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:08 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:08:08 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:08:08 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:08:08 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:08:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:08:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:08.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:08:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:08.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:08 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:09 np0005548918 nova_compute[229246]: 2025-12-06 10:08:09.045 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:09 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c004300 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:09 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:10.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:10.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:10 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:11 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:11 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c004300 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:12.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:12.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c004300 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:12 np0005548918 nova_compute[229246]: 2025-12-06 10:08:12.871 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:13 np0005548918 podman[234394]: 2025-12-06 10:08:13.160074443 +0000 UTC m=+0.050478721 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 05:08:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:13 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c004300 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:13 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:14 np0005548918 nova_compute[229246]: 2025-12-06 10:08:14.104 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:14.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:14.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:14 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:14 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:08:14.924 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:08:14 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:08:14.925 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:08:14 np0005548918 nova_compute[229246]: 2025-12-06 10:08:14.924 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:15 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:15 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:16 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:08:16 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:08:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:16.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:16.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:16 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:17 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:17 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:17 np0005548918 nova_compute[229246]: 2025-12-06 10:08:17.874 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:18.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:18.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:18 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3500016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:19 np0005548918 nova_compute[229246]: 2025-12-06 10:08:19.106 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:19 np0005548918 podman[234449]: 2025-12-06 10:08:19.194576844 +0000 UTC m=+0.065011690 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  6 05:08:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:19 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800008d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:19 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c001e80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:20.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:20.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:20 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3500016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:21 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:21 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800008d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:22.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:22.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:22 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:22 np0005548918 nova_compute[229246]: 2025-12-06 10:08:22.898 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:23 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3500016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:23 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:24 np0005548918 nova_compute[229246]: 2025-12-06 10:08:24.150 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:24 np0005548918 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  6 05:08:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.002000053s ======
Dec  6 05:08:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:24.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Dec  6 05:08:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:24.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:24 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:24 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:08:24.926 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:08:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:25 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:25 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:26.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:08:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:26.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:08:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:26 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:27 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:27 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:27 np0005548918 nova_compute[229246]: 2025-12-06 10:08:27.902 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:28.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:28.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:28 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:29 np0005548918 nova_compute[229246]: 2025-12-06 10:08:29.153 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:29 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:29 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800027a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:30.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:30.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:30 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:31 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:31 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c0030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:32.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:32.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:32 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800027a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:32 np0005548918 nova_compute[229246]: 2025-12-06 10:08:32.905 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:33 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:33 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:34 np0005548918 nova_compute[229246]: 2025-12-06 10:08:34.156 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:34.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:34.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:34 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c0030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:35 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800027a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:35 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:08:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:36.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:08:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:36.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:36 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:37 np0005548918 podman[234513]: 2025-12-06 10:08:37.202396648 +0000 UTC m=+0.088983590 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  6 05:08:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:37 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:37 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3800027a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:37 np0005548918 nova_compute[229246]: 2025-12-06 10:08:37.908 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:38.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:38.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:38 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:39 np0005548918 nova_compute[229246]: 2025-12-06 10:08:39.159 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:39 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:39 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:39 np0005548918 nova_compute[229246]: 2025-12-06 10:08:39.763 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:08:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:40.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:40.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:40 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:41 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:41 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:41 np0005548918 nova_compute[229246]: 2025-12-06 10:08:41.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:08:41 np0005548918 nova_compute[229246]: 2025-12-06 10:08:41.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:08:41 np0005548918 nova_compute[229246]: 2025-12-06 10:08:41.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:08:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:42 np0005548918 nova_compute[229246]: 2025-12-06 10:08:42.531 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:08:42 np0005548918 nova_compute[229246]: 2025-12-06 10:08:42.531 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:08:42 np0005548918 nova_compute[229246]: 2025-12-06 10:08:42.547 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:08:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:42.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:42.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:42 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:42 np0005548918 nova_compute[229246]: 2025-12-06 10:08:42.911 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:43 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:43 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:44 np0005548918 nova_compute[229246]: 2025-12-06 10:08:44.162 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:44 np0005548918 podman[234546]: 2025-12-06 10:08:44.182981749 +0000 UTC m=+0.067985159 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:08:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100844 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:08:44 np0005548918 nova_compute[229246]: 2025-12-06 10:08:44.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:08:44 np0005548918 nova_compute[229246]: 2025-12-06 10:08:44.535 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:08:44 np0005548918 nova_compute[229246]: 2025-12-06 10:08:44.535 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:08:44 np0005548918 nova_compute[229246]: 2025-12-06 10:08:44.563 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:08:44 np0005548918 nova_compute[229246]: 2025-12-06 10:08:44.564 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:08:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:44.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:44.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:44 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:45 np0005548918 nova_compute[229246]: 2025-12-06 10:08:45.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:08:45 np0005548918 nova_compute[229246]: 2025-12-06 10:08:45.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:08:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:45 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:45 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:45 np0005548918 nova_compute[229246]: 2025-12-06 10:08:45.558 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:08:45 np0005548918 nova_compute[229246]: 2025-12-06 10:08:45.558 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:08:45 np0005548918 nova_compute[229246]: 2025-12-06 10:08:45.559 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:08:45 np0005548918 nova_compute[229246]: 2025-12-06 10:08:45.559 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:08:45 np0005548918 nova_compute[229246]: 2025-12-06 10:08:45.560 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:08:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  6 05:08:45 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4086049669' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  6 05:08:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  6 05:08:45 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4086049669' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  6 05:08:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:08:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2148744341' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:08:46 np0005548918 nova_compute[229246]: 2025-12-06 10:08:46.043 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:08:46 np0005548918 nova_compute[229246]: 2025-12-06 10:08:46.184 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:08:46 np0005548918 nova_compute[229246]: 2025-12-06 10:08:46.185 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4928MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:08:46 np0005548918 nova_compute[229246]: 2025-12-06 10:08:46.185 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:08:46 np0005548918 nova_compute[229246]: 2025-12-06 10:08:46.185 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:08:46 np0005548918 nova_compute[229246]: 2025-12-06 10:08:46.251 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:08:46 np0005548918 nova_compute[229246]: 2025-12-06 10:08:46.251 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:08:46 np0005548918 nova_compute[229246]: 2025-12-06 10:08:46.270 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:08:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:46.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:46.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:08:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1552448253' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:08:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:46 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:46 np0005548918 nova_compute[229246]: 2025-12-06 10:08:46.981 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.711s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:08:46 np0005548918 nova_compute[229246]: 2025-12-06 10:08:46.986 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:08:47 np0005548918 nova_compute[229246]: 2025-12-06 10:08:47.004 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:08:47 np0005548918 nova_compute[229246]: 2025-12-06 10:08:47.005 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:08:47 np0005548918 nova_compute[229246]: 2025-12-06 10:08:47.006 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.821s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:08:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:47 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:47 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:47 np0005548918 nova_compute[229246]: 2025-12-06 10:08:47.983 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:48.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:48.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:48 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:49 np0005548918 nova_compute[229246]: 2025-12-06 10:08:49.200 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:49 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:49 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:50 np0005548918 podman[234617]: 2025-12-06 10:08:50.176524475 +0000 UTC m=+0.062290398 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3)
Dec  6 05:08:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:50.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:50.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:50 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:51 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:51 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:52.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:52.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:52 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:53 np0005548918 nova_compute[229246]: 2025-12-06 10:08:53.039 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:53 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:53 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:08:53.674 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:08:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:08:53.675 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:08:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:08:53.675 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:08:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:54 : epoch 69340044 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:08:54 np0005548918 nova_compute[229246]: 2025-12-06 10:08:54.241 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:54.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:54.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:54 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:08:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:55 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c002780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:55 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:56.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:56.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:56 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:08:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:08:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c002780 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:58 np0005548918 nova_compute[229246]: 2025-12-06 10:08:58.042 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:08:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:58.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:08:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:08:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:08:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:58.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:08:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:58 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:59 np0005548918 nova_compute[229246]: 2025-12-06 10:08:59.243 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:08:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:08:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:08:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:59 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:08:59 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003dd0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:08:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:08:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:00 : epoch 69340044 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:09:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.002000053s ======
Dec  6 05:09:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:00.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Dec  6 05:09:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:00.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:00 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c002780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:01 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:01 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:09:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:02.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:09:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:02.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:02 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:03 np0005548918 nova_compute[229246]: 2025-12-06 10:09:03.045 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:03 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:03 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:04 np0005548918 nova_compute[229246]: 2025-12-06 10:09:04.262 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:09:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:04.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:09:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:04.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:04 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:05 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:05 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100906 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:09:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:06.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:06.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:06 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:07 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:07 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354003dd0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:08 np0005548918 nova_compute[229246]: 2025-12-06 10:09:08.047 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:08 np0005548918 podman[234685]: 2025-12-06 10:09:08.193060958 +0000 UTC m=+0.081772954 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:09:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:08.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:08.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:08 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:09 np0005548918 nova_compute[229246]: 2025-12-06 10:09:09.263 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:09 np0005548918 nova_compute[229246]: 2025-12-06 10:09:09.486 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "f06564b9-c316-4167-8633-00e6af858c3b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:09:09 np0005548918 nova_compute[229246]: 2025-12-06 10:09:09.486 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:09:09 np0005548918 nova_compute[229246]: 2025-12-06 10:09:09.506 229250 DEBUG nova.compute.manager [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  6 05:09:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:09 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:09 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:09 np0005548918 nova_compute[229246]: 2025-12-06 10:09:09.635 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:09:09 np0005548918 nova_compute[229246]: 2025-12-06 10:09:09.636 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:09:09 np0005548918 nova_compute[229246]: 2025-12-06 10:09:09.651 229250 DEBUG nova.virt.hardware [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  6 05:09:09 np0005548918 nova_compute[229246]: 2025-12-06 10:09:09.652 229250 INFO nova.compute.claims [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec  6 05:09:09 np0005548918 nova_compute[229246]: 2025-12-06 10:09:09.772 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:09:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:09:10 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/465585946' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.268 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.274 229250 DEBUG nova.compute.provider_tree [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.291 229250 DEBUG nova.scheduler.client.report [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.310 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.311 229250 DEBUG nova.compute.manager [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  6 05:09:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.360 229250 DEBUG nova.compute.manager [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.361 229250 DEBUG nova.network.neutron [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.388 229250 INFO nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.408 229250 DEBUG nova.compute.manager [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.492 229250 DEBUG nova.compute.manager [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.493 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.494 229250 INFO nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Creating image(s)#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.523 229250 DEBUG nova.storage.rbd_utils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image f06564b9-c316-4167-8633-00e6af858c3b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.555 229250 DEBUG nova.storage.rbd_utils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image f06564b9-c316-4167-8633-00e6af858c3b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.580 229250 DEBUG nova.storage.rbd_utils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image f06564b9-c316-4167-8633-00e6af858c3b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.584 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.635 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.636 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "1b7208203e670301d076a006cb3364d3eb842050" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.637 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "1b7208203e670301d076a006cb3364d3eb842050" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.637 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "1b7208203e670301d076a006cb3364d3eb842050" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:09:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:10.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:10.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.667 229250 DEBUG nova.storage.rbd_utils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image f06564b9-c316-4167-8633-00e6af858c3b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.671 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 f06564b9-c316-4167-8633-00e6af858c3b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:09:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:10 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:10 np0005548918 nova_compute[229246]: 2025-12-06 10:09:10.977 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 f06564b9-c316-4167-8633-00e6af858c3b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.306s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:09:11 np0005548918 nova_compute[229246]: 2025-12-06 10:09:11.059 229250 DEBUG nova.storage.rbd_utils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] resizing rbd image f06564b9-c316-4167-8633-00e6af858c3b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Dec  6 05:09:11 np0005548918 nova_compute[229246]: 2025-12-06 10:09:11.109 229250 DEBUG nova.policy [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '03615580775245e6ae335ee9d785611f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  6 05:09:11 np0005548918 nova_compute[229246]: 2025-12-06 10:09:11.166 229250 DEBUG nova.objects.instance [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lazy-loading 'migration_context' on Instance uuid f06564b9-c316-4167-8633-00e6af858c3b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  6 05:09:11 np0005548918 nova_compute[229246]: 2025-12-06 10:09:11.181 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  6 05:09:11 np0005548918 nova_compute[229246]: 2025-12-06 10:09:11.182 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Ensure instance console log exists: /var/lib/nova/instances/f06564b9-c316-4167-8633-00e6af858c3b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  6 05:09:11 np0005548918 nova_compute[229246]: 2025-12-06 10:09:11.182 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:09:11 np0005548918 nova_compute[229246]: 2025-12-06 10:09:11.183 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:09:11 np0005548918 nova_compute[229246]: 2025-12-06 10:09:11.183 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:09:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:11 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:11 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:12.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:12.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:12 np0005548918 nova_compute[229246]: 2025-12-06 10:09:12.734 229250 DEBUG nova.network.neutron [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Successfully created port: dada49f9-c005-4fb8-845a-9a4e59c236ce _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  6 05:09:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:13 np0005548918 nova_compute[229246]: 2025-12-06 10:09:13.049 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:13 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:13 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 50 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:13 np0005548918 nova_compute[229246]: 2025-12-06 10:09:13.990 229250 DEBUG nova.network.neutron [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Successfully updated port: dada49f9-c005-4fb8-845a-9a4e59c236ce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  6 05:09:14 np0005548918 nova_compute[229246]: 2025-12-06 10:09:14.008 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "refresh_cache-f06564b9-c316-4167-8633-00e6af858c3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:09:14 np0005548918 nova_compute[229246]: 2025-12-06 10:09:14.009 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquired lock "refresh_cache-f06564b9-c316-4167-8633-00e6af858c3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:09:14 np0005548918 nova_compute[229246]: 2025-12-06 10:09:14.009 229250 DEBUG nova.network.neutron [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  6 05:09:14 np0005548918 nova_compute[229246]: 2025-12-06 10:09:14.095 229250 DEBUG nova.compute.manager [req-ab9fa797-c337-4f0a-8da0-d5825e084d06 req-cdced5cd-d530-4244-be47-e86cfafdd0e4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Received event network-changed-dada49f9-c005-4fb8-845a-9a4e59c236ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:09:14 np0005548918 nova_compute[229246]: 2025-12-06 10:09:14.095 229250 DEBUG nova.compute.manager [req-ab9fa797-c337-4f0a-8da0-d5825e084d06 req-cdced5cd-d530-4244-be47-e86cfafdd0e4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Refreshing instance network info cache due to event network-changed-dada49f9-c005-4fb8-845a-9a4e59c236ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  6 05:09:14 np0005548918 nova_compute[229246]: 2025-12-06 10:09:14.096 229250 DEBUG oslo_concurrency.lockutils [req-ab9fa797-c337-4f0a-8da0-d5825e084d06 req-cdced5cd-d530-4244-be47-e86cfafdd0e4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "refresh_cache-f06564b9-c316-4167-8633-00e6af858c3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:09:14 np0005548918 nova_compute[229246]: 2025-12-06 10:09:14.189 229250 DEBUG nova.network.neutron [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  6 05:09:14 np0005548918 nova_compute[229246]: 2025-12-06 10:09:14.265 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:14.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:14.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:14 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.141 229250 DEBUG nova.network.neutron [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Updating instance_info_cache with network_info: [{"id": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "address": "fa:16:3e:85:82:b5", "network": {"id": "dccd9941-4f3e-4086-b9cd-651d8e99e8ec", "bridge": "br-int", "label": "tempest-network-smoke--1290241953", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdada49f9-c0", "ovs_interfaceid": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.161 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Releasing lock "refresh_cache-f06564b9-c316-4167-8633-00e6af858c3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.161 229250 DEBUG nova.compute.manager [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Instance network_info: |[{"id": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "address": "fa:16:3e:85:82:b5", "network": {"id": "dccd9941-4f3e-4086-b9cd-651d8e99e8ec", "bridge": "br-int", "label": "tempest-network-smoke--1290241953", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdada49f9-c0", "ovs_interfaceid": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.162 229250 DEBUG oslo_concurrency.lockutils [req-ab9fa797-c337-4f0a-8da0-d5825e084d06 req-cdced5cd-d530-4244-be47-e86cfafdd0e4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquired lock "refresh_cache-f06564b9-c316-4167-8633-00e6af858c3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.163 229250 DEBUG nova.network.neutron [req-ab9fa797-c337-4f0a-8da0-d5825e084d06 req-cdced5cd-d530-4244-be47-e86cfafdd0e4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Refreshing network info cache for port dada49f9-c005-4fb8-845a-9a4e59c236ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.168 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Start _get_guest_xml network_info=[{"id": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "address": "fa:16:3e:85:82:b5", "network": {"id": "dccd9941-4f3e-4086-b9cd-651d8e99e8ec", "bridge": "br-int", "label": "tempest-network-smoke--1290241953", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdada49f9-c0", "ovs_interfaceid": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:04:42Z,direct_url=<?>,disk_format='qcow2',id=9489b8a5-a798-4e26-87f9-59bb1eb2e6fd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3e0ab101ca7547d4a515169a0f2edef3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-06T10:04:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '9489b8a5-a798-4e26-87f9-59bb1eb2e6fd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.176 229250 WARNING nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.187 229250 DEBUG nova.virt.libvirt.host [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.188 229250 DEBUG nova.virt.libvirt.host [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.193 229250 DEBUG nova.virt.libvirt.host [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.194 229250 DEBUG nova.virt.libvirt.host [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.195 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.196 229250 DEBUG nova.virt.hardware [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-06T10:04:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0a252b9c-cc5f-41b2-a8b2-94fcf6e74d22',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:04:42Z,direct_url=<?>,disk_format='qcow2',id=9489b8a5-a798-4e26-87f9-59bb1eb2e6fd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3e0ab101ca7547d4a515169a0f2edef3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-06T10:04:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.197 229250 DEBUG nova.virt.hardware [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.197 229250 DEBUG nova.virt.hardware [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.198 229250 DEBUG nova.virt.hardware [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.198 229250 DEBUG nova.virt.hardware [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.199 229250 DEBUG nova.virt.hardware [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.199 229250 DEBUG nova.virt.hardware [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.200 229250 DEBUG nova.virt.hardware [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.201 229250 DEBUG nova.virt.hardware [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.201 229250 DEBUG nova.virt.hardware [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.202 229250 DEBUG nova.virt.hardware [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  6 05:09:15 np0005548918 podman[234932]: 2025-12-06 10:09:15.207684769 +0000 UTC m=+0.093890754 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.208 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:09:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:15 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:15 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  6 05:09:15 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3189126703' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.697 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.723 229250 DEBUG nova.storage.rbd_utils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image f06564b9-c316-4167-8633-00e6af858c3b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:09:15 np0005548918 nova_compute[229246]: 2025-12-06 10:09:15.728 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:09:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:16 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:16 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:16 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:16 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:16 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  6 05:09:16 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3347373695' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.153 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.155 229250 DEBUG nova.virt.libvirt.vif [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:09:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-871726981',display_name='tempest-TestNetworkBasicOps-server-871726981',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-871726981',id=5,image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB+sK4MgGxvQh21RMm16O6qbl28Kk35BYm52LNfrd3i5H+a8fRktT63KTvuEOHdshPSZyIL9dsImnmOHaCdxSQ4qjdbr+bz0VxvsC5RwImbB6Rr/ZHQJw3TL6yY22cZzvQ==',key_name='tempest-TestNetworkBasicOps-1385547137',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='92b402c8d3e2476abc98be42a1e6d34e',ramdisk_id='',reservation_id='r-9pf33090',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1971100882',owner_user_name='tempest-TestNetworkBasicOps-1971100882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:09:10Z,user_data=None,user_id='03615580775245e6ae335ee9d785611f',uuid=f06564b9-c316-4167-8633-00e6af858c3b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "address": "fa:16:3e:85:82:b5", "network": {"id": "dccd9941-4f3e-4086-b9cd-651d8e99e8ec", "bridge": "br-int", "label": "tempest-network-smoke--1290241953", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdada49f9-c0", "ovs_interfaceid": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.155 229250 DEBUG nova.network.os_vif_util [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converting VIF {"id": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "address": "fa:16:3e:85:82:b5", "network": {"id": "dccd9941-4f3e-4086-b9cd-651d8e99e8ec", "bridge": "br-int", "label": "tempest-network-smoke--1290241953", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdada49f9-c0", "ovs_interfaceid": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.156 229250 DEBUG nova.network.os_vif_util [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:82:b5,bridge_name='br-int',has_traffic_filtering=True,id=dada49f9-c005-4fb8-845a-9a4e59c236ce,network=Network(dccd9941-4f3e-4086-b9cd-651d8e99e8ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdada49f9-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.157 229250 DEBUG nova.objects.instance [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lazy-loading 'pci_devices' on Instance uuid f06564b9-c316-4167-8633-00e6af858c3b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.170 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] End _get_guest_xml xml=<domain type="kvm">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  <uuid>f06564b9-c316-4167-8633-00e6af858c3b</uuid>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  <name>instance-00000005</name>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  <memory>131072</memory>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  <vcpu>1</vcpu>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  <metadata>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <nova:name>tempest-TestNetworkBasicOps-server-871726981</nova:name>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <nova:creationTime>2025-12-06 10:09:15</nova:creationTime>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <nova:flavor name="m1.nano">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <nova:memory>128</nova:memory>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <nova:disk>1</nova:disk>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <nova:swap>0</nova:swap>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <nova:ephemeral>0</nova:ephemeral>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <nova:vcpus>1</nova:vcpus>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      </nova:flavor>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <nova:owner>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <nova:user uuid="03615580775245e6ae335ee9d785611f">tempest-TestNetworkBasicOps-1971100882-project-member</nova:user>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <nova:project uuid="92b402c8d3e2476abc98be42a1e6d34e">tempest-TestNetworkBasicOps-1971100882</nova:project>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      </nova:owner>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <nova:root type="image" uuid="9489b8a5-a798-4e26-87f9-59bb1eb2e6fd"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <nova:ports>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <nova:port uuid="dada49f9-c005-4fb8-845a-9a4e59c236ce">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        </nova:port>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      </nova:ports>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    </nova:instance>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  </metadata>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  <sysinfo type="smbios">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <system>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <entry name="manufacturer">RDO</entry>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <entry name="product">OpenStack Compute</entry>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <entry name="serial">f06564b9-c316-4167-8633-00e6af858c3b</entry>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <entry name="uuid">f06564b9-c316-4167-8633-00e6af858c3b</entry>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <entry name="family">Virtual Machine</entry>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    </system>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  </sysinfo>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  <os>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <boot dev="hd"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <smbios mode="sysinfo"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  </os>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  <features>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <acpi/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <apic/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <vmcoreinfo/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  </features>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  <clock offset="utc">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <timer name="pit" tickpolicy="delay"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <timer name="hpet" present="no"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  </clock>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  <cpu mode="host-model" match="exact">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <topology sockets="1" cores="1" threads="1"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  </cpu>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  <devices>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <disk type="network" device="disk">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <driver type="raw" cache="none"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <source protocol="rbd" name="vms/f06564b9-c316-4167-8633-00e6af858c3b_disk">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <host name="192.168.122.100" port="6789"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <host name="192.168.122.102" port="6789"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <host name="192.168.122.101" port="6789"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      </source>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <auth username="openstack">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <secret type="ceph" uuid="5ecd3f74-dade-5fc4-92ce-8950ae424258"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      </auth>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <target dev="vda" bus="virtio"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    </disk>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <disk type="network" device="cdrom">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <driver type="raw" cache="none"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <source protocol="rbd" name="vms/f06564b9-c316-4167-8633-00e6af858c3b_disk.config">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <host name="192.168.122.100" port="6789"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <host name="192.168.122.102" port="6789"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <host name="192.168.122.101" port="6789"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      </source>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <auth username="openstack">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:        <secret type="ceph" uuid="5ecd3f74-dade-5fc4-92ce-8950ae424258"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      </auth>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <target dev="sda" bus="sata"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    </disk>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <interface type="ethernet">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <mac address="fa:16:3e:85:82:b5"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <model type="virtio"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <driver name="vhost" rx_queue_size="512"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <mtu size="1442"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <target dev="tapdada49f9-c0"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    </interface>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <serial type="pty">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <log file="/var/lib/nova/instances/f06564b9-c316-4167-8633-00e6af858c3b/console.log" append="off"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    </serial>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <video>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <model type="virtio"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    </video>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <input type="tablet" bus="usb"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <rng model="virtio">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <backend model="random">/dev/urandom</backend>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    </rng>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <controller type="usb" index="0"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    <memballoon model="virtio">
Dec  6 05:09:16 np0005548918 nova_compute[229246]:      <stats period="10"/>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:    </memballoon>
Dec  6 05:09:16 np0005548918 nova_compute[229246]:  </devices>
Dec  6 05:09:16 np0005548918 nova_compute[229246]: </domain>
Dec  6 05:09:16 np0005548918 nova_compute[229246]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.171 229250 DEBUG nova.compute.manager [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Preparing to wait for external event network-vif-plugged-dada49f9-c005-4fb8-845a-9a4e59c236ce prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.172 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "f06564b9-c316-4167-8633-00e6af858c3b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.172 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.172 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.173 229250 DEBUG nova.virt.libvirt.vif [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:09:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-871726981',display_name='tempest-TestNetworkBasicOps-server-871726981',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-871726981',id=5,image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB+sK4MgGxvQh21RMm16O6qbl28Kk35BYm52LNfrd3i5H+a8fRktT63KTvuEOHdshPSZyIL9dsImnmOHaCdxSQ4qjdbr+bz0VxvsC5RwImbB6Rr/ZHQJw3TL6yY22cZzvQ==',key_name='tempest-TestNetworkBasicOps-1385547137',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='92b402c8d3e2476abc98be42a1e6d34e',ramdisk_id='',reservation_id='r-9pf33090',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1971100882',owner_user_name='tempest-TestNetworkBasicOps-1971100882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:09:10Z,user_data=None,user_id='03615580775245e6ae335ee9d785611f',uuid=f06564b9-c316-4167-8633-00e6af858c3b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "address": "fa:16:3e:85:82:b5", "network": {"id": "dccd9941-4f3e-4086-b9cd-651d8e99e8ec", "bridge": "br-int", "label": "tempest-network-smoke--1290241953", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdada49f9-c0", "ovs_interfaceid": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.173 229250 DEBUG nova.network.os_vif_util [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converting VIF {"id": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "address": "fa:16:3e:85:82:b5", "network": {"id": "dccd9941-4f3e-4086-b9cd-651d8e99e8ec", "bridge": "br-int", "label": "tempest-network-smoke--1290241953", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdada49f9-c0", "ovs_interfaceid": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.173 229250 DEBUG nova.network.os_vif_util [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:82:b5,bridge_name='br-int',has_traffic_filtering=True,id=dada49f9-c005-4fb8-845a-9a4e59c236ce,network=Network(dccd9941-4f3e-4086-b9cd-651d8e99e8ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdada49f9-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.174 229250 DEBUG os_vif [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:82:b5,bridge_name='br-int',has_traffic_filtering=True,id=dada49f9-c005-4fb8-845a-9a4e59c236ce,network=Network(dccd9941-4f3e-4086-b9cd-651d8e99e8ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdada49f9-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.174 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.175 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.175 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.178 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.179 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdada49f9-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.179 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdada49f9-c0, col_values=(('external_ids', {'iface-id': 'dada49f9-c005-4fb8-845a-9a4e59c236ce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:85:82:b5', 'vm-uuid': 'f06564b9-c316-4167-8633-00e6af858c3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:09:16 np0005548918 NetworkManager[48884]: <info>  [1765015756.1820] manager: (tapdada49f9-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.182 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.186 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.187 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.188 229250 INFO os_vif [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:82:b5,bridge_name='br-int',has_traffic_filtering=True,id=dada49f9-c005-4fb8-845a-9a4e59c236ce,network=Network(dccd9941-4f3e-4086-b9cd-651d8e99e8ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdada49f9-c0')#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.226 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.226 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.227 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] No VIF found with MAC fa:16:3e:85:82:b5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.227 229250 INFO nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Using config drive#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.256 229250 DEBUG nova.storage.rbd_utils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image f06564b9-c316-4167-8633-00e6af858c3b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:09:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.577 229250 INFO nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Creating config drive at /var/lib/nova/instances/f06564b9-c316-4167-8633-00e6af858c3b/disk.config#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.582 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f06564b9-c316-4167-8633-00e6af858c3b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpurzeob8z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.627 229250 DEBUG nova.network.neutron [req-ab9fa797-c337-4f0a-8da0-d5825e084d06 req-cdced5cd-d530-4244-be47-e86cfafdd0e4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Updated VIF entry in instance network info cache for port dada49f9-c005-4fb8-845a-9a4e59c236ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.627 229250 DEBUG nova.network.neutron [req-ab9fa797-c337-4f0a-8da0-d5825e084d06 req-cdced5cd-d530-4244-be47-e86cfafdd0e4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Updating instance_info_cache with network_info: [{"id": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "address": "fa:16:3e:85:82:b5", "network": {"id": "dccd9941-4f3e-4086-b9cd-651d8e99e8ec", "bridge": "br-int", "label": "tempest-network-smoke--1290241953", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdada49f9-c0", "ovs_interfaceid": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.642 229250 DEBUG oslo_concurrency.lockutils [req-ab9fa797-c337-4f0a-8da0-d5825e084d06 req-cdced5cd-d530-4244-be47-e86cfafdd0e4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Releasing lock "refresh_cache-f06564b9-c316-4167-8633-00e6af858c3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:09:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:16.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:16.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.723 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f06564b9-c316-4167-8633-00e6af858c3b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpurzeob8z" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.760 229250 DEBUG nova.storage.rbd_utils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image f06564b9-c316-4167-8633-00e6af858c3b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.765 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f06564b9-c316-4167-8633-00e6af858c3b/disk.config f06564b9-c316-4167-8633-00e6af858c3b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:09:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:16 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.946 229250 DEBUG oslo_concurrency.processutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f06564b9-c316-4167-8633-00e6af858c3b/disk.config f06564b9-c316-4167-8633-00e6af858c3b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.181s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:09:16 np0005548918 nova_compute[229246]: 2025-12-06 10:09:16.947 229250 INFO nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Deleting local config drive /var/lib/nova/instances/f06564b9-c316-4167-8633-00e6af858c3b/disk.config because it was imported into RBD.#033[00m
Dec  6 05:09:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:16 np0005548918 systemd[1]: Starting libvirt secret daemon...
Dec  6 05:09:17 np0005548918 systemd[1]: Started libvirt secret daemon.
Dec  6 05:09:17 np0005548918 kernel: tapdada49f9-c0: entered promiscuous mode
Dec  6 05:09:17 np0005548918 NetworkManager[48884]: <info>  [1765015757.0573] manager: (tapdada49f9-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Dec  6 05:09:17 np0005548918 ovn_controller[132371]: 2025-12-06T10:09:17Z|00043|binding|INFO|Claiming lport dada49f9-c005-4fb8-845a-9a4e59c236ce for this chassis.
Dec  6 05:09:17 np0005548918 ovn_controller[132371]: 2025-12-06T10:09:17Z|00044|binding|INFO|dada49f9-c005-4fb8-845a-9a4e59c236ce: Claiming fa:16:3e:85:82:b5 10.100.0.8
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.057 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.076 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:82:b5 10.100.0.8'], port_security=['fa:16:3e:85:82:b5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f06564b9-c316-4167-8633-00e6af858c3b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dccd9941-4f3e-4086-b9cd-651d8e99e8ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8746e50b-7b98-4682-a6b3-e2469105e4b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=611cd505-2a02-4d45-a906-bd97d1447953, chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], logical_port=dada49f9-c005-4fb8-845a-9a4e59c236ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.078 141640 INFO neutron.agent.ovn.metadata.agent [-] Port dada49f9-c005-4fb8-845a-9a4e59c236ce in datapath dccd9941-4f3e-4086-b9cd-651d8e99e8ec bound to our chassis#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.081 141640 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dccd9941-4f3e-4086-b9cd-651d8e99e8ec#033[00m
Dec  6 05:09:17 np0005548918 systemd-machined[192688]: New machine qemu-2-instance-00000005.
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.093 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[2c75ba07-660d-4199-963d-52df73680ef8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.094 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdccd9941-41 in ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.096 233203 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdccd9941-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.096 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[c50aaa27-a735-455d-9885-73bf7e6df25a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.096 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[aa02ff9a-1570-459e-b33b-df9157a33c6c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 systemd-udevd[235187]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 05:09:17 np0005548918 NetworkManager[48884]: <info>  [1765015757.1139] device (tapdada49f9-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.112 141754 DEBUG oslo.privsep.daemon [-] privsep: reply[e194836c-d7b4-4b0c-ac5d-8846e9a69b9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 NetworkManager[48884]: <info>  [1765015757.1148] device (tapdada49f9-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  6 05:09:17 np0005548918 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Dec  6 05:09:17 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:09:17 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:17 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:17 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.145 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[b5055796-ef82-47b6-8e97-840438688dfc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.162 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:17 np0005548918 ovn_controller[132371]: 2025-12-06T10:09:17Z|00045|binding|INFO|Setting lport dada49f9-c005-4fb8-845a-9a4e59c236ce ovn-installed in OVS
Dec  6 05:09:17 np0005548918 ovn_controller[132371]: 2025-12-06T10:09:17Z|00046|binding|INFO|Setting lport dada49f9-c005-4fb8-845a-9a4e59c236ce up in Southbound
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.168 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.173 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[ce0c18fb-5afa-4742-8539-dbb2911bbfb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.177 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[42cfa4c7-bc5a-4b7c-8b9d-eb9b0c001ded]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 NetworkManager[48884]: <info>  [1765015757.1791] manager: (tapdccd9941-40): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.207 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[e4df08ce-97b2-4529-a448-b2c23ebf6fee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.209 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[ab2d2d73-2488-4d14-b282-bfae4c68232d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 NetworkManager[48884]: <info>  [1765015757.2345] device (tapdccd9941-40): carrier: link connected
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.239 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[18a04ab3-ab68-4252-b686-ab8b7266bbdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.258 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[b705477d-bee6-45f9-b427-4fe9260a87c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdccd9941-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:b1:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413083, 'reachable_time': 37330, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235220, 'error': None, 'target': 'ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.273 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[c98a907c-c817-4f98-8c8c-4bb7f442e038]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:b1b9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413083, 'tstamp': 413083}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235221, 'error': None, 'target': 'ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.293 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[9c69cc1d-28a1-4447-90d0-15321aaa4b7f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdccd9941-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:b1:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413083, 'reachable_time': 37330, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235229, 'error': None, 'target': 'ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.325 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[50331e92-4404-4d8b-b964-cbbbc2a3116c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.373 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[dce27d10-44af-4cfe-a521-0fad637e9c03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.375 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdccd9941-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.375 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.375 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdccd9941-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:09:17 np0005548918 kernel: tapdccd9941-40: entered promiscuous mode
Dec  6 05:09:17 np0005548918 NetworkManager[48884]: <info>  [1765015757.3783] manager: (tapdccd9941-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.377 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.379 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.380 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdccd9941-40, col_values=(('external_ids', {'iface-id': '5c84c258-875b-4b17-864b-0a3a247ec558'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.381 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:17 np0005548918 ovn_controller[132371]: 2025-12-06T10:09:17Z|00047|binding|INFO|Releasing lport 5c84c258-875b-4b17-864b-0a3a247ec558 from this chassis (sb_readonly=0)
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.393 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.394 141640 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dccd9941-4f3e-4086-b9cd-651d8e99e8ec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dccd9941-4f3e-4086-b9cd-651d8e99e8ec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.395 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[81241d0f-9af0-4ba1-a431-a9b5bdc042cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.396 141640 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: global
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    log         /dev/log local0 debug
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    log-tag     haproxy-metadata-proxy-dccd9941-4f3e-4086-b9cd-651d8e99e8ec
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    user        root
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    group       root
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    maxconn     1024
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    pidfile     /var/lib/neutron/external/pids/dccd9941-4f3e-4086-b9cd-651d8e99e8ec.pid.haproxy
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    daemon
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: defaults
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    log global
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    mode http
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    option httplog
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    option dontlognull
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    option http-server-close
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    option forwardfor
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    retries                 3
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    timeout http-request    30s
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    timeout connect         30s
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    timeout client          32s
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    timeout server          32s
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    timeout http-keep-alive 30s
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: listen listener
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    bind 169.254.169.254:80
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    server metadata /var/lib/neutron/metadata_proxy
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]:    http-request add-header X-OVN-Network-ID dccd9941-4f3e-4086-b9cd-651d8e99e8ec
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.397 141640 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec', 'env', 'PROCESS_TAG=haproxy-dccd9941-4f3e-4086-b9cd-651d8e99e8ec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dccd9941-4f3e-4086-b9cd-651d8e99e8ec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.453 229250 DEBUG nova.virt.driver [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Emitting event <LifecycleEvent: 1765015757.4525528, f06564b9-c316-4167-8633-00e6af858c3b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.454 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: f06564b9-c316-4167-8633-00e6af858c3b] VM Started (Lifecycle Event)#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.484 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.488 229250 DEBUG nova.virt.driver [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Emitting event <LifecycleEvent: 1765015757.452791, f06564b9-c316-4167-8633-00e6af858c3b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.488 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: f06564b9-c316-4167-8633-00e6af858c3b] VM Paused (Lifecycle Event)#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.557 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.559 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  6 05:09:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:17 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:17 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.584 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: f06564b9-c316-4167-8633-00e6af858c3b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.669 229250 DEBUG nova.compute.manager [req-96cfd94a-d76c-451e-bf54-f8f1fc96323b req-f1b446f1-3374-497b-9316-a91c1fad7712 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Received event network-vif-plugged-dada49f9-c005-4fb8-845a-9a4e59c236ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.670 229250 DEBUG oslo_concurrency.lockutils [req-96cfd94a-d76c-451e-bf54-f8f1fc96323b req-f1b446f1-3374-497b-9316-a91c1fad7712 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "f06564b9-c316-4167-8633-00e6af858c3b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.670 229250 DEBUG oslo_concurrency.lockutils [req-96cfd94a-d76c-451e-bf54-f8f1fc96323b req-f1b446f1-3374-497b-9316-a91c1fad7712 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.670 229250 DEBUG oslo_concurrency.lockutils [req-96cfd94a-d76c-451e-bf54-f8f1fc96323b req-f1b446f1-3374-497b-9316-a91c1fad7712 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.671 229250 DEBUG nova.compute.manager [req-96cfd94a-d76c-451e-bf54-f8f1fc96323b req-f1b446f1-3374-497b-9316-a91c1fad7712 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Processing event network-vif-plugged-dada49f9-c005-4fb8-845a-9a4e59c236ce _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.671 229250 DEBUG nova.compute.manager [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.674 229250 DEBUG nova.virt.driver [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Emitting event <LifecycleEvent: 1765015757.6740894, f06564b9-c316-4167-8633-00e6af858c3b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.675 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: f06564b9-c316-4167-8633-00e6af858c3b] VM Resumed (Lifecycle Event)#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.676 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.679 229250 INFO nova.virt.libvirt.driver [-] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Instance spawned successfully.#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.680 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.694 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.699 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.702 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.702 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.703 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.703 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.704 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.704 229250 DEBUG nova.virt.libvirt.driver [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.721 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: f06564b9-c316-4167-8633-00e6af858c3b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  6 05:09:17 np0005548918 podman[235297]: 2025-12-06 10:09:17.772362449 +0000 UTC m=+0.049250244 container create c0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.789 229250 INFO nova.compute.manager [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Took 7.30 seconds to spawn the instance on the hypervisor.#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.790 229250 DEBUG nova.compute.manager [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.806 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.806 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:09:17 np0005548918 systemd[1]: Started libpod-conmon-c0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d.scope.
Dec  6 05:09:17 np0005548918 podman[235297]: 2025-12-06 10:09:17.745659297 +0000 UTC m=+0.022547122 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.847 229250 INFO nova.compute.manager [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Took 8.28 seconds to build instance.#033[00m
Dec  6 05:09:17 np0005548918 systemd[1]: Started libcrun container.
Dec  6 05:09:17 np0005548918 nova_compute[229246]: 2025-12-06 10:09:17.860 229250 DEBUG oslo_concurrency.lockutils [None req-1fa84641-0343-47c5-8922-52061065df15 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:09:17 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93bd849e348fa7759c507305b5690eef5ac1c3c28245ed2324af3343d04b9117/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  6 05:09:17 np0005548918 podman[235297]: 2025-12-06 10:09:17.877411733 +0000 UTC m=+0.154299558 container init c0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  6 05:09:17 np0005548918 podman[235297]: 2025-12-06 10:09:17.885530613 +0000 UTC m=+0.162418418 container start c0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  6 05:09:17 np0005548918 neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec[235312]: [NOTICE]   (235316) : New worker (235318) forked
Dec  6 05:09:17 np0005548918 neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec[235312]: [NOTICE]   (235316) : Loading success.
Dec  6 05:09:17 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:17.931 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:09:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:09:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:18.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:09:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:09:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:18.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:09:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:18 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:19 np0005548918 nova_compute[229246]: 2025-12-06 10:09:19.268 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:19 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:19 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:19 np0005548918 nova_compute[229246]: 2025-12-06 10:09:19.750 229250 DEBUG nova.compute.manager [req-fbce536a-6dcd-462e-b3a9-bd503961fafa req-fc9ee0b5-23c6-43f9-8098-ad54ed0410ea d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Received event network-vif-plugged-dada49f9-c005-4fb8-845a-9a4e59c236ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:09:19 np0005548918 nova_compute[229246]: 2025-12-06 10:09:19.751 229250 DEBUG oslo_concurrency.lockutils [req-fbce536a-6dcd-462e-b3a9-bd503961fafa req-fc9ee0b5-23c6-43f9-8098-ad54ed0410ea d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "f06564b9-c316-4167-8633-00e6af858c3b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:09:19 np0005548918 nova_compute[229246]: 2025-12-06 10:09:19.752 229250 DEBUG oslo_concurrency.lockutils [req-fbce536a-6dcd-462e-b3a9-bd503961fafa req-fc9ee0b5-23c6-43f9-8098-ad54ed0410ea d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:09:19 np0005548918 nova_compute[229246]: 2025-12-06 10:09:19.752 229250 DEBUG oslo_concurrency.lockutils [req-fbce536a-6dcd-462e-b3a9-bd503961fafa req-fc9ee0b5-23c6-43f9-8098-ad54ed0410ea d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:09:19 np0005548918 nova_compute[229246]: 2025-12-06 10:09:19.753 229250 DEBUG nova.compute.manager [req-fbce536a-6dcd-462e-b3a9-bd503961fafa req-fc9ee0b5-23c6-43f9-8098-ad54ed0410ea d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] No waiting events found dispatching network-vif-plugged-dada49f9-c005-4fb8-845a-9a4e59c236ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:09:19 np0005548918 nova_compute[229246]: 2025-12-06 10:09:19.753 229250 WARNING nova.compute.manager [req-fbce536a-6dcd-462e-b3a9-bd503961fafa req-fc9ee0b5-23c6-43f9-8098-ad54ed0410ea d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Received unexpected event network-vif-plugged-dada49f9-c005-4fb8-845a-9a4e59c236ce for instance with vm_state active and task_state None.#033[00m
Dec  6 05:09:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:20.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:20.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:20 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:21 np0005548918 podman[235330]: 2025-12-06 10:09:21.179614171 +0000 UTC m=+0.067147209 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd)
Dec  6 05:09:21 np0005548918 nova_compute[229246]: 2025-12-06 10:09:21.182 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:21 np0005548918 NetworkManager[48884]: <info>  [1765015761.2251] manager: (patch-br-int-to-provnet-c81e973e-7ff9-4cd2-9994-daf87649321f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Dec  6 05:09:21 np0005548918 NetworkManager[48884]: <info>  [1765015761.2263] manager: (patch-provnet-c81e973e-7ff9-4cd2-9994-daf87649321f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Dec  6 05:09:21 np0005548918 ovn_controller[132371]: 2025-12-06T10:09:21Z|00048|binding|INFO|Releasing lport 5c84c258-875b-4b17-864b-0a3a247ec558 from this chassis (sb_readonly=0)
Dec  6 05:09:21 np0005548918 nova_compute[229246]: 2025-12-06 10:09:21.240 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:21 np0005548918 ovn_controller[132371]: 2025-12-06T10:09:21Z|00049|binding|INFO|Releasing lport 5c84c258-875b-4b17-864b-0a3a247ec558 from this chassis (sb_readonly=0)
Dec  6 05:09:21 np0005548918 nova_compute[229246]: 2025-12-06 10:09:21.260 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:21 np0005548918 nova_compute[229246]: 2025-12-06 10:09:21.265 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:21 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:21 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:22.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:22.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:22 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:23 np0005548918 nova_compute[229246]: 2025-12-06 10:09:23.058 229250 DEBUG nova.compute.manager [req-1b6b2636-6d28-4734-99f3-b2de9ef2a54a req-3f18f03e-af47-4915-95b0-2f07efb7a0b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Received event network-changed-dada49f9-c005-4fb8-845a-9a4e59c236ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:09:23 np0005548918 nova_compute[229246]: 2025-12-06 10:09:23.059 229250 DEBUG nova.compute.manager [req-1b6b2636-6d28-4734-99f3-b2de9ef2a54a req-3f18f03e-af47-4915-95b0-2f07efb7a0b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Refreshing instance network info cache due to event network-changed-dada49f9-c005-4fb8-845a-9a4e59c236ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  6 05:09:23 np0005548918 nova_compute[229246]: 2025-12-06 10:09:23.059 229250 DEBUG oslo_concurrency.lockutils [req-1b6b2636-6d28-4734-99f3-b2de9ef2a54a req-3f18f03e-af47-4915-95b0-2f07efb7a0b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "refresh_cache-f06564b9-c316-4167-8633-00e6af858c3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:09:23 np0005548918 nova_compute[229246]: 2025-12-06 10:09:23.060 229250 DEBUG oslo_concurrency.lockutils [req-1b6b2636-6d28-4734-99f3-b2de9ef2a54a req-3f18f03e-af47-4915-95b0-2f07efb7a0b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquired lock "refresh_cache-f06564b9-c316-4167-8633-00e6af858c3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:09:23 np0005548918 nova_compute[229246]: 2025-12-06 10:09:23.060 229250 DEBUG nova.network.neutron [req-1b6b2636-6d28-4734-99f3-b2de9ef2a54a req-3f18f03e-af47-4915-95b0-2f07efb7a0b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Refreshing network info cache for port dada49f9-c005-4fb8-845a-9a4e59c236ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  6 05:09:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:23 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3900091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:23 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:23 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:23 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:09:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:24 np0005548918 nova_compute[229246]: 2025-12-06 10:09:24.272 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:24 np0005548918 nova_compute[229246]: 2025-12-06 10:09:24.545 229250 DEBUG nova.network.neutron [req-1b6b2636-6d28-4734-99f3-b2de9ef2a54a req-3f18f03e-af47-4915-95b0-2f07efb7a0b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Updated VIF entry in instance network info cache for port dada49f9-c005-4fb8-845a-9a4e59c236ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  6 05:09:24 np0005548918 nova_compute[229246]: 2025-12-06 10:09:24.545 229250 DEBUG nova.network.neutron [req-1b6b2636-6d28-4734-99f3-b2de9ef2a54a req-3f18f03e-af47-4915-95b0-2f07efb7a0b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Updating instance_info_cache with network_info: [{"id": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "address": "fa:16:3e:85:82:b5", "network": {"id": "dccd9941-4f3e-4086-b9cd-651d8e99e8ec", "bridge": "br-int", "label": "tempest-network-smoke--1290241953", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdada49f9-c0", "ovs_interfaceid": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:09:24 np0005548918 nova_compute[229246]: 2025-12-06 10:09:24.584 229250 DEBUG oslo_concurrency.lockutils [req-1b6b2636-6d28-4734-99f3-b2de9ef2a54a req-3f18f03e-af47-4915-95b0-2f07efb7a0b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Releasing lock "refresh_cache-f06564b9-c316-4167-8633-00e6af858c3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:09:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:24.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:24.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:24 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:25 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:25 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3900091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:25 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:25.934 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:09:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:26 np0005548918 nova_compute[229246]: 2025-12-06 10:09:26.187 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:26.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:26.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:26 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:27 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:27 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:09:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:28.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:09:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:28.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:28 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:29 np0005548918 nova_compute[229246]: 2025-12-06 10:09:29.309 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:29 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:29 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:30.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:09:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:30.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:09:30 np0005548918 ovn_controller[132371]: 2025-12-06T10:09:30Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:85:82:b5 10.100.0.8
Dec  6 05:09:30 np0005548918 ovn_controller[132371]: 2025-12-06T10:09:30Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:85:82:b5 10.100.0.8
Dec  6 05:09:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:30 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:31 np0005548918 nova_compute[229246]: 2025-12-06 10:09:31.191 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:31 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:31 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:32.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:09:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:32.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:09:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:32 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:33 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:33 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:34 np0005548918 nova_compute[229246]: 2025-12-06 10:09:34.311 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:34.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:34.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:34 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:35 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:35 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:36 np0005548918 nova_compute[229246]: 2025-12-06 10:09:36.193 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:36.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:36.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:36 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:37 np0005548918 nova_compute[229246]: 2025-12-06 10:09:37.457 229250 INFO nova.compute.manager [None req-35c74861-6863-442a-a004-ee6eb31aea65 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Get console output#033[00m
Dec  6 05:09:37 np0005548918 nova_compute[229246]: 2025-12-06 10:09:37.463 229250 INFO oslo.privsep.daemon [None req-35c74861-6863-442a-a004-ee6eb31aea65 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp27gqt_ng/privsep.sock']#033[00m
Dec  6 05:09:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:37 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:37 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.163 229250 INFO oslo.privsep.daemon [None req-35c74861-6863-442a-a004-ee6eb31aea65 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.047 235422 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.054 235422 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.058 235422 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.059 235422 INFO oslo.privsep.daemon [-] privsep daemon running as pid 235422#033[00m
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.251 235422 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec  6 05:09:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:38.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:38.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.806 229250 DEBUG oslo_concurrency.lockutils [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "f06564b9-c316-4167-8633-00e6af858c3b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.806 229250 DEBUG oslo_concurrency.lockutils [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.807 229250 DEBUG oslo_concurrency.lockutils [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "f06564b9-c316-4167-8633-00e6af858c3b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.807 229250 DEBUG oslo_concurrency.lockutils [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.807 229250 DEBUG oslo_concurrency.lockutils [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.809 229250 INFO nova.compute.manager [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Terminating instance#033[00m
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.811 229250 DEBUG nova.compute.manager [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  6 05:09:38 np0005548918 kernel: tapdada49f9-c0 (unregistering): left promiscuous mode
Dec  6 05:09:38 np0005548918 NetworkManager[48884]: <info>  [1765015778.8690] device (tapdada49f9-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  6 05:09:38 np0005548918 ovn_controller[132371]: 2025-12-06T10:09:38Z|00050|binding|INFO|Releasing lport dada49f9-c005-4fb8-845a-9a4e59c236ce from this chassis (sb_readonly=0)
Dec  6 05:09:38 np0005548918 ovn_controller[132371]: 2025-12-06T10:09:38Z|00051|binding|INFO|Setting lport dada49f9-c005-4fb8-845a-9a4e59c236ce down in Southbound
Dec  6 05:09:38 np0005548918 ovn_controller[132371]: 2025-12-06T10:09:38Z|00052|binding|INFO|Removing iface tapdada49f9-c0 ovn-installed in OVS
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.881 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:38 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:38.893 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:82:b5 10.100.0.8'], port_security=['fa:16:3e:85:82:b5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f06564b9-c316-4167-8633-00e6af858c3b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dccd9941-4f3e-4086-b9cd-651d8e99e8ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8746e50b-7b98-4682-a6b3-e2469105e4b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=611cd505-2a02-4d45-a906-bd97d1447953, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], logical_port=dada49f9-c005-4fb8-845a-9a4e59c236ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:09:38 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:38.894 141640 INFO neutron.agent.ovn.metadata.agent [-] Port dada49f9-c005-4fb8-845a-9a4e59c236ce in datapath dccd9941-4f3e-4086-b9cd-651d8e99e8ec unbound from our chassis#033[00m
Dec  6 05:09:38 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:38.895 141640 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dccd9941-4f3e-4086-b9cd-651d8e99e8ec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  6 05:09:38 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:38.896 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[38ee3470-3e74-46c5-874c-48818ba3c039]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:38 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:38.897 141640 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec namespace which is not needed anymore#033[00m
Dec  6 05:09:38 np0005548918 nova_compute[229246]: 2025-12-06 10:09:38.901 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:38 np0005548918 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Deactivated successfully.
Dec  6 05:09:38 np0005548918 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Consumed 13.307s CPU time.
Dec  6 05:09:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:38 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:38 np0005548918 systemd-machined[192688]: Machine qemu-2-instance-00000005 terminated.
Dec  6 05:09:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:39 np0005548918 podman[235425]: 2025-12-06 10:09:39.002616255 +0000 UTC m=+0.108651012 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller)
Dec  6 05:09:39 np0005548918 neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec[235312]: [NOTICE]   (235316) : haproxy version is 2.8.14-c23fe91
Dec  6 05:09:39 np0005548918 neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec[235312]: [NOTICE]   (235316) : path to executable is /usr/sbin/haproxy
Dec  6 05:09:39 np0005548918 neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec[235312]: [WARNING]  (235316) : Exiting Master process...
Dec  6 05:09:39 np0005548918 neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec[235312]: [ALERT]    (235316) : Current worker (235318) exited with code 143 (Terminated)
Dec  6 05:09:39 np0005548918 neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec[235312]: [WARNING]  (235316) : All workers exited. Exiting... (0)
Dec  6 05:09:39 np0005548918 systemd[1]: libpod-c0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d.scope: Deactivated successfully.
Dec  6 05:09:39 np0005548918 podman[235477]: 2025-12-06 10:09:39.032762161 +0000 UTC m=+0.049964753 container died c0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.055 229250 INFO nova.virt.libvirt.driver [-] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Instance destroyed successfully.#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.058 229250 DEBUG nova.objects.instance [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lazy-loading 'resources' on Instance uuid f06564b9-c316-4167-8633-00e6af858c3b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  6 05:09:39 np0005548918 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d-userdata-shm.mount: Deactivated successfully.
Dec  6 05:09:39 np0005548918 systemd[1]: var-lib-containers-storage-overlay-93bd849e348fa7759c507305b5690eef5ac1c3c28245ed2324af3343d04b9117-merged.mount: Deactivated successfully.
Dec  6 05:09:39 np0005548918 podman[235477]: 2025-12-06 10:09:39.077258826 +0000 UTC m=+0.094461408 container cleanup c0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:09:39 np0005548918 systemd[1]: libpod-conmon-c0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d.scope: Deactivated successfully.
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.096 229250 DEBUG nova.virt.libvirt.vif [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-06T10:09:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-871726981',display_name='tempest-TestNetworkBasicOps-server-871726981',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-871726981',id=5,image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB+sK4MgGxvQh21RMm16O6qbl28Kk35BYm52LNfrd3i5H+a8fRktT63KTvuEOHdshPSZyIL9dsImnmOHaCdxSQ4qjdbr+bz0VxvsC5RwImbB6Rr/ZHQJw3TL6yY22cZzvQ==',key_name='tempest-TestNetworkBasicOps-1385547137',keypairs=<?>,launch_index=0,launched_at=2025-12-06T10:09:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='92b402c8d3e2476abc98be42a1e6d34e',ramdisk_id='',reservation_id='r-9pf33090',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1971100882',owner_user_name='tempest-TestNetworkBasicOps-1971100882-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T10:09:17Z,user_data=None,user_id='03615580775245e6ae335ee9d785611f',uuid=f06564b9-c316-4167-8633-00e6af858c3b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "address": "fa:16:3e:85:82:b5", "network": {"id": "dccd9941-4f3e-4086-b9cd-651d8e99e8ec", "bridge": "br-int", "label": "tempest-network-smoke--1290241953", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdada49f9-c0", "ovs_interfaceid": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.097 229250 DEBUG nova.network.os_vif_util [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converting VIF {"id": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "address": "fa:16:3e:85:82:b5", "network": {"id": "dccd9941-4f3e-4086-b9cd-651d8e99e8ec", "bridge": "br-int", "label": "tempest-network-smoke--1290241953", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdada49f9-c0", "ovs_interfaceid": "dada49f9-c005-4fb8-845a-9a4e59c236ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.098 229250 DEBUG nova.network.os_vif_util [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:85:82:b5,bridge_name='br-int',has_traffic_filtering=True,id=dada49f9-c005-4fb8-845a-9a4e59c236ce,network=Network(dccd9941-4f3e-4086-b9cd-651d8e99e8ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdada49f9-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.098 229250 DEBUG os_vif [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:82:b5,bridge_name='br-int',has_traffic_filtering=True,id=dada49f9-c005-4fb8-845a-9a4e59c236ce,network=Network(dccd9941-4f3e-4086-b9cd-651d8e99e8ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdada49f9-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.100 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.100 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdada49f9-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.101 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.103 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.106 229250 INFO os_vif [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:82:b5,bridge_name='br-int',has_traffic_filtering=True,id=dada49f9-c005-4fb8-845a-9a4e59c236ce,network=Network(dccd9941-4f3e-4086-b9cd-651d8e99e8ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdada49f9-c0')#033[00m
Dec  6 05:09:39 np0005548918 podman[235520]: 2025-12-06 10:09:39.14060626 +0000 UTC m=+0.043219190 container remove c0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:09:39 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:39.145 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[5e6f289b-0cbb-4308-9ad9-c58de58c4704]: (4, ('Sat Dec  6 10:09:38 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec (c0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d)\nc0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d\nSat Dec  6 10:09:39 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec (c0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d)\nc0e5f2e62dc181e97dddbd5e7c65774c9fea77a8679763455d839454b7c6310d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:39 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:39.147 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[7b536551-5f0a-4d8c-91d5-ca0f5f0b607b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:39 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:39.148 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdccd9941-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:09:39 np0005548918 kernel: tapdccd9941-40: left promiscuous mode
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.150 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.162 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:39 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:39.164 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[ba050721-d06b-45b3-89e6-33644ef4e9fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:39 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:39.181 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[615219d8-c645-41a3-9035-e44ea55e51e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:39 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:39.183 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[5f6f4679-7fb3-48c5-8107-b022aefb4f6d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:39 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:39.196 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[e9ebe721-8683-4777-a266-0ac7ac6151ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413077, 'reachable_time': 37830, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235551, 'error': None, 'target': 'ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:39 np0005548918 systemd[1]: run-netns-ovnmeta\x2ddccd9941\x2d4f3e\x2d4086\x2db9cd\x2d651d8e99e8ec.mount: Deactivated successfully.
Dec  6 05:09:39 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:39.206 141754 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dccd9941-4f3e-4086-b9cd-651d8e99e8ec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  6 05:09:39 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:39.206 141754 DEBUG oslo.privsep.daemon [-] privsep: reply[dcd445e4-c750-4a67-90f8-ff1efb5b1371]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:09:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.347 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.456 229250 DEBUG nova.compute.manager [req-7da1377b-eb86-4b3f-86c2-2c843ce0e1b1 req-a537a5ba-973b-41ce-b19d-0c92fcbbd7f7 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Received event network-vif-unplugged-dada49f9-c005-4fb8-845a-9a4e59c236ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.457 229250 DEBUG oslo_concurrency.lockutils [req-7da1377b-eb86-4b3f-86c2-2c843ce0e1b1 req-a537a5ba-973b-41ce-b19d-0c92fcbbd7f7 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "f06564b9-c316-4167-8633-00e6af858c3b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.457 229250 DEBUG oslo_concurrency.lockutils [req-7da1377b-eb86-4b3f-86c2-2c843ce0e1b1 req-a537a5ba-973b-41ce-b19d-0c92fcbbd7f7 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.457 229250 DEBUG oslo_concurrency.lockutils [req-7da1377b-eb86-4b3f-86c2-2c843ce0e1b1 req-a537a5ba-973b-41ce-b19d-0c92fcbbd7f7 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.458 229250 DEBUG nova.compute.manager [req-7da1377b-eb86-4b3f-86c2-2c843ce0e1b1 req-a537a5ba-973b-41ce-b19d-0c92fcbbd7f7 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] No waiting events found dispatching network-vif-unplugged-dada49f9-c005-4fb8-845a-9a4e59c236ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:09:39 np0005548918 nova_compute[229246]: 2025-12-06 10:09:39.458 229250 DEBUG nova.compute.manager [req-7da1377b-eb86-4b3f-86c2-2c843ce0e1b1 req-a537a5ba-973b-41ce-b19d-0c92fcbbd7f7 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Received event network-vif-unplugged-dada49f9-c005-4fb8-845a-9a4e59c236ce for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  6 05:09:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:39 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:39 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:40 np0005548918 nova_compute[229246]: 2025-12-06 10:09:40.151 229250 INFO nova.virt.libvirt.driver [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Deleting instance files /var/lib/nova/instances/f06564b9-c316-4167-8633-00e6af858c3b_del#033[00m
Dec  6 05:09:40 np0005548918 nova_compute[229246]: 2025-12-06 10:09:40.152 229250 INFO nova.virt.libvirt.driver [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Deletion of /var/lib/nova/instances/f06564b9-c316-4167-8633-00e6af858c3b_del complete#033[00m
Dec  6 05:09:40 np0005548918 nova_compute[229246]: 2025-12-06 10:09:40.215 229250 INFO nova.compute.manager [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Took 1.40 seconds to destroy the instance on the hypervisor.#033[00m
Dec  6 05:09:40 np0005548918 nova_compute[229246]: 2025-12-06 10:09:40.216 229250 DEBUG oslo.service.loopingcall [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  6 05:09:40 np0005548918 nova_compute[229246]: 2025-12-06 10:09:40.216 229250 DEBUG nova.compute.manager [-] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  6 05:09:40 np0005548918 nova_compute[229246]: 2025-12-06 10:09:40.217 229250 DEBUG nova.network.neutron [-] [instance: f06564b9-c316-4167-8633-00e6af858c3b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  6 05:09:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/100940 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:09:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:40.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:40.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:40 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:41 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:09:41 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4286393203' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:09:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:41 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:41 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380002e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:41 np0005548918 nova_compute[229246]: 2025-12-06 10:09:41.649 229250 DEBUG nova.compute.manager [req-f0d47a69-5f63-400b-8c51-415dc69b4946 req-fd813e46-51db-40f4-b493-0f6f6fe47b7d d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Received event network-vif-plugged-dada49f9-c005-4fb8-845a-9a4e59c236ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:09:41 np0005548918 nova_compute[229246]: 2025-12-06 10:09:41.650 229250 DEBUG oslo_concurrency.lockutils [req-f0d47a69-5f63-400b-8c51-415dc69b4946 req-fd813e46-51db-40f4-b493-0f6f6fe47b7d d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "f06564b9-c316-4167-8633-00e6af858c3b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:09:41 np0005548918 nova_compute[229246]: 2025-12-06 10:09:41.650 229250 DEBUG oslo_concurrency.lockutils [req-f0d47a69-5f63-400b-8c51-415dc69b4946 req-fd813e46-51db-40f4-b493-0f6f6fe47b7d d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:09:41 np0005548918 nova_compute[229246]: 2025-12-06 10:09:41.651 229250 DEBUG oslo_concurrency.lockutils [req-f0d47a69-5f63-400b-8c51-415dc69b4946 req-fd813e46-51db-40f4-b493-0f6f6fe47b7d d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:09:41 np0005548918 nova_compute[229246]: 2025-12-06 10:09:41.651 229250 DEBUG nova.compute.manager [req-f0d47a69-5f63-400b-8c51-415dc69b4946 req-fd813e46-51db-40f4-b493-0f6f6fe47b7d d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] No waiting events found dispatching network-vif-plugged-dada49f9-c005-4fb8-845a-9a4e59c236ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:09:41 np0005548918 nova_compute[229246]: 2025-12-06 10:09:41.652 229250 WARNING nova.compute.manager [req-f0d47a69-5f63-400b-8c51-415dc69b4946 req-fd813e46-51db-40f4-b493-0f6f6fe47b7d d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Received unexpected event network-vif-plugged-dada49f9-c005-4fb8-845a-9a4e59c236ce for instance with vm_state active and task_state deleting.#033[00m
Dec  6 05:09:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:42 np0005548918 nova_compute[229246]: 2025-12-06 10:09:42.006 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:09:42 np0005548918 nova_compute[229246]: 2025-12-06 10:09:42.006 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:09:42 np0005548918 nova_compute[229246]: 2025-12-06 10:09:42.007 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:09:42 np0005548918 nova_compute[229246]: 2025-12-06 10:09:42.007 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:09:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:42.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:42.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:42 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:43 np0005548918 nova_compute[229246]: 2025-12-06 10:09:43.008 229250 DEBUG nova.network.neutron [-] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:09:43 np0005548918 nova_compute[229246]: 2025-12-06 10:09:43.049 229250 INFO nova.compute.manager [-] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Took 2.83 seconds to deallocate network for instance.#033[00m
Dec  6 05:09:43 np0005548918 nova_compute[229246]: 2025-12-06 10:09:43.120 229250 DEBUG oslo_concurrency.lockutils [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:09:43 np0005548918 nova_compute[229246]: 2025-12-06 10:09:43.121 229250 DEBUG oslo_concurrency.lockutils [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:09:43 np0005548918 nova_compute[229246]: 2025-12-06 10:09:43.124 229250 DEBUG nova.compute.manager [req-5dcd490e-c6fa-47ce-95ca-7db0db8ea589 req-aa636876-dca1-40aa-9c6c-5b284cc72f71 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Received event network-vif-deleted-dada49f9-c005-4fb8-845a-9a4e59c236ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:09:43 np0005548918 nova_compute[229246]: 2025-12-06 10:09:43.176 229250 DEBUG oslo_concurrency.processutils [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:09:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:43 np0005548918 nova_compute[229246]: 2025-12-06 10:09:43.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:09:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:43 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:43 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:43 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:09:43 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3729956179' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:09:43 np0005548918 nova_compute[229246]: 2025-12-06 10:09:43.666 229250 DEBUG oslo_concurrency.processutils [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:09:43 np0005548918 nova_compute[229246]: 2025-12-06 10:09:43.672 229250 DEBUG nova.compute.provider_tree [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:09:43 np0005548918 nova_compute[229246]: 2025-12-06 10:09:43.938 229250 DEBUG nova.scheduler.client.report [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:09:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:44 np0005548918 nova_compute[229246]: 2025-12-06 10:09:44.104 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:44 np0005548918 nova_compute[229246]: 2025-12-06 10:09:44.235 229250 DEBUG oslo_concurrency.lockutils [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:09:44 np0005548918 nova_compute[229246]: 2025-12-06 10:09:44.263 229250 INFO nova.scheduler.client.report [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Deleted allocations for instance f06564b9-c316-4167-8633-00e6af858c3b#033[00m
Dec  6 05:09:44 np0005548918 nova_compute[229246]: 2025-12-06 10:09:44.319 229250 DEBUG oslo_concurrency.lockutils [None req-65d925e4-52c6-4fa7-9c1e-541bd7f32da6 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "f06564b9-c316-4167-8633-00e6af858c3b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.513s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:09:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:44 np0005548918 nova_compute[229246]: 2025-12-06 10:09:44.396 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:09:44 np0005548918 nova_compute[229246]: 2025-12-06 10:09:44.530 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:09:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:44.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:44.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:44 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380002e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:45 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:45 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  6 05:09:45 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2942484271' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  6 05:09:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  6 05:09:45 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2942484271' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  6 05:09:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:46 np0005548918 podman[235583]: 2025-12-06 10:09:46.196388895 +0000 UTC m=+0.077409267 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec  6 05:09:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:46 np0005548918 nova_compute[229246]: 2025-12-06 10:09:46.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:09:46 np0005548918 nova_compute[229246]: 2025-12-06 10:09:46.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:09:46 np0005548918 nova_compute[229246]: 2025-12-06 10:09:46.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:09:46 np0005548918 nova_compute[229246]: 2025-12-06 10:09:46.553 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:09:46 np0005548918 nova_compute[229246]: 2025-12-06 10:09:46.554 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:09:46 np0005548918 nova_compute[229246]: 2025-12-06 10:09:46.554 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:09:46 np0005548918 nova_compute[229246]: 2025-12-06 10:09:46.555 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:09:46 np0005548918 nova_compute[229246]: 2025-12-06 10:09:46.580 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:09:46 np0005548918 nova_compute[229246]: 2025-12-06 10:09:46.580 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:09:46 np0005548918 nova_compute[229246]: 2025-12-06 10:09:46.580 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:09:46 np0005548918 nova_compute[229246]: 2025-12-06 10:09:46.581 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:09:46 np0005548918 nova_compute[229246]: 2025-12-06 10:09:46.581 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:09:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:09:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:46.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:09:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:09:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:46.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:09:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:46 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0047e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:47 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:09:47 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1962556856' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:09:47 np0005548918 nova_compute[229246]: 2025-12-06 10:09:47.031 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:09:47 np0005548918 nova_compute[229246]: 2025-12-06 10:09:47.205 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  6 05:09:47 np0005548918 nova_compute[229246]: 2025-12-06 10:09:47.206 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4793MB free_disk=59.94268798828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  6 05:09:47 np0005548918 nova_compute[229246]: 2025-12-06 10:09:47.206 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:09:47 np0005548918 nova_compute[229246]: 2025-12-06 10:09:47.207 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:09:47 np0005548918 nova_compute[229246]: 2025-12-06 10:09:47.267 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  6 05:09:47 np0005548918 nova_compute[229246]: 2025-12-06 10:09:47.267 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  6 05:09:47 np0005548918 nova_compute[229246]: 2025-12-06 10:09:47.294 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:09:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:47 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380002e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:47 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004140 fd 50 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:09:48 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3467548251' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:09:48 np0005548918 nova_compute[229246]: 2025-12-06 10:09:48.203 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.909s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  6 05:09:48 np0005548918 nova_compute[229246]: 2025-12-06 10:09:48.209 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  6 05:09:48 np0005548918 nova_compute[229246]: 2025-12-06 10:09:48.233 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  6 05:09:48 np0005548918 nova_compute[229246]: 2025-12-06 10:09:48.254 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  6 05:09:48 np0005548918 nova_compute[229246]: 2025-12-06 10:09:48.255 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.048s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:09:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:09:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:48.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:09:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:48.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:48 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380002e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:49 : epoch 69340044 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:09:49 np0005548918 nova_compute[229246]: 2025-12-06 10:09:49.108 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:09:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:49 np0005548918 nova_compute[229246]: 2025-12-06 10:09:49.427 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:09:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:49 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:49 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:50.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:50.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:50 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:51 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380002e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:51 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:52 : epoch 69340044 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:09:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:52 : epoch 69340044 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:09:52 np0005548918 podman[235657]: 2025-12-06 10:09:52.194904537 +0000 UTC m=+0.083540323 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  6 05:09:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:52.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:09:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:52.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:09:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:52 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:53 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:53 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380002e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:53.676 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:09:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:53.678 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:09:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:09:53.678 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:09:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:54 np0005548918 nova_compute[229246]: 2025-12-06 10:09:54.055 229250 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765015779.0529318, f06564b9-c316-4167-8633-00e6af858c3b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  6 05:09:54 np0005548918 nova_compute[229246]: 2025-12-06 10:09:54.055 229250 INFO nova.compute.manager [-] [instance: f06564b9-c316-4167-8633-00e6af858c3b] VM Stopped (Lifecycle Event)
Dec  6 05:09:54 np0005548918 nova_compute[229246]: 2025-12-06 10:09:54.077 229250 DEBUG nova.compute.manager [None req-47cb31d7-f800-4bc4-8f6a-be7ff11ebcb7 - - - - - -] [instance: f06564b9-c316-4167-8633-00e6af858c3b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  6 05:09:54 np0005548918 nova_compute[229246]: 2025-12-06 10:09:54.143 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:09:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:54 np0005548918 nova_compute[229246]: 2025-12-06 10:09:54.429 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:09:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:09:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:54.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:09:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:54.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:54 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:09:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:55 : epoch 69340044 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:09:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:55 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:55 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:56 np0005548918 nova_compute[229246]: 2025-12-06 10:09:56.295 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:09:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:56 np0005548918 nova_compute[229246]: 2025-12-06 10:09:56.413 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:09:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:09:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:56.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:09:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:56.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:56 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:58.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:09:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:09:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:58.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:09:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:58 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:59 np0005548918 nova_compute[229246]: 2025-12-06 10:09:59.147 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:09:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:09:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:09:59 np0005548918 nova_compute[229246]: 2025-12-06 10:09:59.431 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:09:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:59 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:09:59 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:09:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:09:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/101000 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:10:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:00.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:00.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:00 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:00 np0005548918 ceph-mon[75798]: Health detail: HEALTH_WARN 1 failed cephadm daemon(s)
Dec  6 05:10:00 np0005548918 ceph-mon[75798]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
Dec  6 05:10:00 np0005548918 ceph-mon[75798]:    daemon nfs.cephfs.2.0.compute-0.dfwxck on compute-0 is in unknown state
Dec  6 05:10:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:01 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:01 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:02.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:02.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:02 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003fa0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:03 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003fa0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:03 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:04 np0005548918 nova_compute[229246]: 2025-12-06 10:10:04.151 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:04 np0005548918 nova_compute[229246]: 2025-12-06 10:10:04.471 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:10:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:04.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:10:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:04.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:04 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:05 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004200 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:05 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003fa0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:10:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:06.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:10:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:10:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:06.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:10:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:06 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:07 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:07 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004200 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:08.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:10:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:08.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:10:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:08 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003fa0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:09 np0005548918 nova_compute[229246]: 2025-12-06 10:10:09.154 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:09 np0005548918 podman[235720]: 2025-12-06 10:10:09.252171933 +0000 UTC m=+0.128691295 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  6 05:10:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:09 np0005548918 nova_compute[229246]: 2025-12-06 10:10:09.473 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:09 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:09 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:10 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:10:10 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.5 total, 600.0 interval#012Cumulative writes: 5182 writes, 27K keys, 5182 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s#012Cumulative WAL: 5182 writes, 5182 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1494 writes, 7274 keys, 1494 commit groups, 1.0 writes per commit group, ingest: 16.85 MB, 0.03 MB/s#012Interval WAL: 1495 writes, 1495 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     43.0      0.89              0.12        14    0.064       0      0       0.0       0.0#012  L6      1/0   12.72 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.4     99.1     86.1      1.97              0.43        13    0.152     67K   6727       0.0       0.0#012 Sum      1/0   12.72 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   5.4     68.2     72.6      2.87              0.55        27    0.106     67K   6727       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.9     99.8     99.0      0.76              0.19        10    0.076     29K   2587       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0     99.1     86.1      1.97              0.43        13    0.152     67K   6727       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     72.0      0.53              0.12        13    0.041       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.36              0.00         1    0.360       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.5 total, 600.0 interval#012Flush(GB): cumulative 0.037, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.20 GB write, 0.12 MB/s write, 0.19 GB read, 0.11 MB/s read, 2.9 seconds#012Interval compaction: 0.07 GB write, 0.13 MB/s write, 0.07 GB read, 0.13 MB/s read, 0.8 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55784c777350#2 capacity: 304.00 MB usage: 13.53 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000132 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(718,12.98 MB,4.27082%) FilterBlock(27,201.92 KB,0.0648649%) IndexBlock(27,355.77 KB,0.114285%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  6 05:10:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:10:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:10.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:10:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:10:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:10.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:10:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:10 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004200 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:11 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003fa0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:11 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:10:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:12.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:10:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:12.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:12 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:13 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004200 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:13 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003fa0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:14 np0005548918 nova_compute[229246]: 2025-12-06 10:10:14.159 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:14 np0005548918 nova_compute[229246]: 2025-12-06 10:10:14.501 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:10:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:14.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:10:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:14.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:14 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:15 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:15 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004200 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:10:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:16.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:10:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:16.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:16 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc360003fa0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:17 np0005548918 podman[235779]: 2025-12-06 10:10:17.203265755 +0000 UTC m=+0.085528855 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  6 05:10:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:17 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:17 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004200 fd 50 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:10:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:18.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:10:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:18.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:18 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:19 np0005548918 nova_compute[229246]: 2025-12-06 10:10:19.162 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:19 np0005548918 nova_compute[229246]: 2025-12-06 10:10:19.531 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:19 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0047e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:19 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350001090 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:20 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:10:20.410 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:10:20 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:10:20.411 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:10:20 np0005548918 nova_compute[229246]: 2025-12-06 10:10:20.412 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:20.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:20.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:20 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004200 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:21 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0047e0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:21 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004200 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:22.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:22.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:22 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390002010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:23 np0005548918 podman[235830]: 2025-12-06 10:10:23.202456896 +0000 UTC m=+0.075398142 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd)
Dec  6 05:10:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:23 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004200 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:23 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350001090 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:23 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:23 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:24 np0005548918 nova_compute[229246]: 2025-12-06 10:10:24.166 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:24 np0005548918 nova_compute[229246]: 2025-12-06 10:10:24.533 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:10:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:24.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:10:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:10:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:24.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:10:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:24 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0047e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:25 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:25 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390002010 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:25 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:25 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:26.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:26.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:26 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390002010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:27 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:27 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0047e0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:27 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:27 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:27 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:10:27 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:27 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:27 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:10:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:28.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:28.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:28 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:29 np0005548918 nova_compute[229246]: 2025-12-06 10:10:29.169 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:29 np0005548918 nova_compute[229246]: 2025-12-06 10:10:29.564 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:29 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:29 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390002010 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:30 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:10:30.414 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:10:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:30.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:30.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:30 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:31 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0047e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:31 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0047e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:10:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:32.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:10:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:32.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:10:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:32 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390002010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:33 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:33 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:34 np0005548918 nova_compute[229246]: 2025-12-06 10:10:34.172 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:34 np0005548918 ovn_controller[132371]: 2025-12-06T10:10:34Z|00053|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Dec  6 05:10:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:34 np0005548918 nova_compute[229246]: 2025-12-06 10:10:34.603 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:34.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:34.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:34 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0047e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:35 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:35 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350001090 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:36.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:10:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:36.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:10:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:36 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:37 np0005548918 nova_compute[229246]: 2025-12-06 10:10:37.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:10:37 np0005548918 nova_compute[229246]: 2025-12-06 10:10:37.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  6 05:10:37 np0005548918 nova_compute[229246]: 2025-12-06 10:10:37.554 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  6 05:10:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:37 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:37 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:38.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:38.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:39 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:39 np0005548918 nova_compute[229246]: 2025-12-06 10:10:39.175 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:39 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:39 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:39 np0005548918 nova_compute[229246]: 2025-12-06 10:10:39.644 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:40 np0005548918 podman[236048]: 2025-12-06 10:10:40.217919739 +0000 UTC m=+0.106392361 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  6 05:10:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:40 np0005548918 nova_compute[229246]: 2025-12-06 10:10:40.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:10:40 np0005548918 nova_compute[229246]: 2025-12-06 10:10:40.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  6 05:10:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:40.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:40.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:41 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:41 np0005548918 nova_compute[229246]: 2025-12-06 10:10:41.550 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:10:41 np0005548918 nova_compute[229246]: 2025-12-06 10:10:41.550 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:10:41 np0005548918 nova_compute[229246]: 2025-12-06 10:10:41.550 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:10:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:41 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:41 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:42 np0005548918 nova_compute[229246]: 2025-12-06 10:10:42.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:10:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:42.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:10:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:42.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:10:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:43 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:43 np0005548918 nova_compute[229246]: 2025-12-06 10:10:43.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:10:43 np0005548918 nova_compute[229246]: 2025-12-06 10:10:43.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:10:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:43 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:43 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc390009cd0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:44 np0005548918 nova_compute[229246]: 2025-12-06 10:10:44.179 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:44 np0005548918 nova_compute[229246]: 2025-12-06 10:10:44.689 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:44.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:10:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:44.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:10:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:45 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:45 np0005548918 nova_compute[229246]: 2025-12-06 10:10:45.542 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:10:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:46 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0047e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:46 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:46 np0005548918 nova_compute[229246]: 2025-12-06 10:10:46.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:10:46 np0005548918 nova_compute[229246]: 2025-12-06 10:10:46.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:10:46 np0005548918 nova_compute[229246]: 2025-12-06 10:10:46.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:10:46 np0005548918 nova_compute[229246]: 2025-12-06 10:10:46.560 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:10:46 np0005548918 nova_compute[229246]: 2025-12-06 10:10:46.561 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:10:46 np0005548918 nova_compute[229246]: 2025-12-06 10:10:46.561 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:10:46 np0005548918 nova_compute[229246]: 2025-12-06 10:10:46.561 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:10:46 np0005548918 nova_compute[229246]: 2025-12-06 10:10:46.562 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:10:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:46.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:46.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:10:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3587187352' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:10:46 np0005548918 nova_compute[229246]: 2025-12-06 10:10:46.995 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:10:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:47 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0047e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:47 np0005548918 nova_compute[229246]: 2025-12-06 10:10:47.152 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:10:47 np0005548918 nova_compute[229246]: 2025-12-06 10:10:47.154 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4876MB free_disk=59.94276428222656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:10:47 np0005548918 nova_compute[229246]: 2025-12-06 10:10:47.154 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:10:47 np0005548918 nova_compute[229246]: 2025-12-06 10:10:47.154 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:10:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:47 np0005548918 nova_compute[229246]: 2025-12-06 10:10:47.398 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:10:47 np0005548918 nova_compute[229246]: 2025-12-06 10:10:47.398 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:10:47 np0005548918 nova_compute[229246]: 2025-12-06 10:10:47.450 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:10:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:47 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:47 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:47 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:10:47 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2299801742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:10:47 np0005548918 nova_compute[229246]: 2025-12-06 10:10:47.869 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:10:47 np0005548918 nova_compute[229246]: 2025-12-06 10:10:47.874 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:10:47 np0005548918 nova_compute[229246]: 2025-12-06 10:10:47.891 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:10:47 np0005548918 nova_compute[229246]: 2025-12-06 10:10:47.893 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:10:47 np0005548918 nova_compute[229246]: 2025-12-06 10:10:47.893 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.739s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:10:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:48 np0005548918 podman[236126]: 2025-12-06 10:10:48.172003493 +0000 UTC m=+0.051946477 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  6 05:10:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:48.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:48.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:48 np0005548918 nova_compute[229246]: 2025-12-06 10:10:48.892 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:10:48 np0005548918 nova_compute[229246]: 2025-12-06 10:10:48.893 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:10:48 np0005548918 nova_compute[229246]: 2025-12-06 10:10:48.893 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:10:48 np0005548918 nova_compute[229246]: 2025-12-06 10:10:48.968 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:10:48 np0005548918 nova_compute[229246]: 2025-12-06 10:10:48.969 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:10:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:49 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:49 np0005548918 nova_compute[229246]: 2025-12-06 10:10:49.182 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:49 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c0047e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:49 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:49 np0005548918 nova_compute[229246]: 2025-12-06 10:10:49.722 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:50.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:50.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:51 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:51 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:51 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368002f80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:52.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:52.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:53 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:53 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368002f80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:53 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:10:53.678 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:10:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:10:53.678 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:10:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:10:53.678 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:10:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:54 np0005548918 nova_compute[229246]: 2025-12-06 10:10:54.184 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:54 np0005548918 podman[236177]: 2025-12-06 10:10:54.189479128 +0000 UTC m=+0.074426695 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  6 05:10:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:54 np0005548918 nova_compute[229246]: 2025-12-06 10:10:54.724 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:54.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:54.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:54 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Dec  6 05:10:54 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:54.995882) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:10:54 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Dec  6 05:10:54 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015854995915, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2371, "num_deletes": 251, "total_data_size": 6486955, "memory_usage": 6571744, "flush_reason": "Manual Compaction"}
Dec  6 05:10:54 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Dec  6 05:10:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:55 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855037279, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 4184524, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26077, "largest_seqno": 28443, "table_properties": {"data_size": 4174854, "index_size": 6100, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20016, "raw_average_key_size": 20, "raw_value_size": 4155559, "raw_average_value_size": 4236, "num_data_blocks": 267, "num_entries": 981, "num_filter_entries": 981, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015644, "oldest_key_time": 1765015644, "file_creation_time": 1765015854, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 41452 microseconds, and 7824 cpu microseconds.
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.037328) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 4184524 bytes OK
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.037350) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.039364) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.039376) EVENT_LOG_v1 {"time_micros": 1765015855039373, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.039391) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6476383, prev total WAL file size 6476383, number of live WAL files 2.
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.040556) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(4086KB)], [51(12MB)]
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855040616, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 17518478, "oldest_snapshot_seqno": -1}
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5893 keys, 15448125 bytes, temperature: kUnknown
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855191727, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 15448125, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15407341, "index_size": 24930, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14789, "raw_key_size": 149712, "raw_average_key_size": 25, "raw_value_size": 15299364, "raw_average_value_size": 2596, "num_data_blocks": 1018, "num_entries": 5893, "num_filter_entries": 5893, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765015855, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.191953) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 15448125 bytes
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.193430) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 115.9 rd, 102.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.7 +0.0 blob) out(14.7 +0.0 blob), read-write-amplify(7.9) write-amplify(3.7) OK, records in: 6413, records dropped: 520 output_compression: NoCompression
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.193447) EVENT_LOG_v1 {"time_micros": 1765015855193439, "job": 30, "event": "compaction_finished", "compaction_time_micros": 151183, "compaction_time_cpu_micros": 28294, "output_level": 6, "num_output_files": 1, "total_output_size": 15448125, "num_input_records": 6413, "num_output_records": 5893, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855194177, "job": 30, "event": "table_file_deletion", "file_number": 53}
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855196547, "job": 30, "event": "table_file_deletion", "file_number": 51}
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.040466) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.196650) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.196656) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.196658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.196661) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:10:55 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:10:55.196665) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:10:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:55 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:55 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368002f80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:10:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:56.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:10:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:56.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:58.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:10:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:10:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:58.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:10:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:59 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368002f80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:59 np0005548918 nova_compute[229246]: 2025-12-06 10:10:59.188 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:10:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:10:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:59 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:10:59 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:10:59 np0005548918 nova_compute[229246]: 2025-12-06 10:10:59.741 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:10:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:10:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:10:59.999 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "000b9af7-febc-46b4-801c-39d129655fbe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.000 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.016 229250 DEBUG nova.compute.manager [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  6 05:11:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.152 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.153 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.163 229250 DEBUG nova.virt.hardware [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.164 229250 INFO nova.compute.claims [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.281 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:11:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:00.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.770 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.776 229250 DEBUG nova.compute.provider_tree [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.795 229250 DEBUG nova.scheduler.client.report [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:11:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:00.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.820 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.821 229250 DEBUG nova.compute.manager [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.861 229250 DEBUG nova.compute.manager [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.862 229250 DEBUG nova.network.neutron [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.881 229250 INFO nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  6 05:11:00 np0005548918 nova_compute[229246]: 2025-12-06 10:11:00.901 229250 DEBUG nova.compute.manager [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  6 05:11:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.009 229250 DEBUG nova.compute.manager [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.010 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.010 229250 INFO nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Creating image(s)#033[00m
Dec  6 05:11:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:01 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.045 229250 DEBUG nova.storage.rbd_utils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 000b9af7-febc-46b4-801c-39d129655fbe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.080 229250 DEBUG nova.storage.rbd_utils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 000b9af7-febc-46b4-801c-39d129655fbe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.115 229250 DEBUG nova.storage.rbd_utils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 000b9af7-febc-46b4-801c-39d129655fbe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.120 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.152 229250 DEBUG nova.policy [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '03615580775245e6ae335ee9d785611f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.172 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.172 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "1b7208203e670301d076a006cb3364d3eb842050" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.173 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "1b7208203e670301d076a006cb3364d3eb842050" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.173 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "1b7208203e670301d076a006cb3364d3eb842050" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.202 229250 DEBUG nova.storage.rbd_utils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 000b9af7-febc-46b4-801c-39d129655fbe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.206 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 000b9af7-febc-46b4-801c-39d129655fbe_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:11:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.493 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 000b9af7-febc-46b4-801c-39d129655fbe_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.287s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.581 229250 DEBUG nova.storage.rbd_utils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] resizing rbd image 000b9af7-febc-46b4-801c-39d129655fbe_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Dec  6 05:11:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:01 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368002f80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:01 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.721 229250 DEBUG nova.objects.instance [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lazy-loading 'migration_context' on Instance uuid 000b9af7-febc-46b4-801c-39d129655fbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.735 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.736 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Ensure instance console log exists: /var/lib/nova/instances/000b9af7-febc-46b4-801c-39d129655fbe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.736 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.737 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:01 np0005548918 nova_compute[229246]: 2025-12-06 10:11:01.738 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:02 np0005548918 nova_compute[229246]: 2025-12-06 10:11:02.362 229250 DEBUG nova.network.neutron [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Successfully created port: 39cc9025-b8be-492b-82cf-765dd17ecbc2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  6 05:11:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:11:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 8458 writes, 34K keys, 8458 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 8458 writes, 2073 syncs, 4.08 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2428 writes, 9161 keys, 2428 commit groups, 1.0 writes per commit group, ingest: 9.91 MB, 0.02 MB/s#012Interval WAL: 2428 writes, 973 syncs, 2.50 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  6 05:11:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:02.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:11:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:02.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:11:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:03 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:03 np0005548918 nova_compute[229246]: 2025-12-06 10:11:03.197 229250 DEBUG nova.network.neutron [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Successfully updated port: 39cc9025-b8be-492b-82cf-765dd17ecbc2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  6 05:11:03 np0005548918 nova_compute[229246]: 2025-12-06 10:11:03.223 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "refresh_cache-000b9af7-febc-46b4-801c-39d129655fbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:11:03 np0005548918 nova_compute[229246]: 2025-12-06 10:11:03.223 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquired lock "refresh_cache-000b9af7-febc-46b4-801c-39d129655fbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:11:03 np0005548918 nova_compute[229246]: 2025-12-06 10:11:03.223 229250 DEBUG nova.network.neutron [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  6 05:11:03 np0005548918 nova_compute[229246]: 2025-12-06 10:11:03.286 229250 DEBUG nova.compute.manager [req-be12d95c-9a47-4f7a-a2bd-4a6de84eef60 req-c501b821-7d44-40e3-8248-234e47a8b4b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Received event network-changed-39cc9025-b8be-492b-82cf-765dd17ecbc2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:11:03 np0005548918 nova_compute[229246]: 2025-12-06 10:11:03.287 229250 DEBUG nova.compute.manager [req-be12d95c-9a47-4f7a-a2bd-4a6de84eef60 req-c501b821-7d44-40e3-8248-234e47a8b4b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Refreshing instance network info cache due to event network-changed-39cc9025-b8be-492b-82cf-765dd17ecbc2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  6 05:11:03 np0005548918 nova_compute[229246]: 2025-12-06 10:11:03.287 229250 DEBUG oslo_concurrency.lockutils [req-be12d95c-9a47-4f7a-a2bd-4a6de84eef60 req-c501b821-7d44-40e3-8248-234e47a8b4b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "refresh_cache-000b9af7-febc-46b4-801c-39d129655fbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:11:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:03 np0005548918 nova_compute[229246]: 2025-12-06 10:11:03.380 229250 DEBUG nova.network.neutron [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  6 05:11:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:03 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:03 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.194 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.690 229250 DEBUG nova.network.neutron [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Updating instance_info_cache with network_info: [{"id": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "address": "fa:16:3e:a1:92:1a", "network": {"id": "af11da89-c29d-4ef1-80d5-4b619757b0ff", "bridge": "br-int", "label": "tempest-network-smoke--2039147327", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39cc9025-b8", "ovs_interfaceid": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.708 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Releasing lock "refresh_cache-000b9af7-febc-46b4-801c-39d129655fbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.708 229250 DEBUG nova.compute.manager [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Instance network_info: |[{"id": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "address": "fa:16:3e:a1:92:1a", "network": {"id": "af11da89-c29d-4ef1-80d5-4b619757b0ff", "bridge": "br-int", "label": "tempest-network-smoke--2039147327", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39cc9025-b8", "ovs_interfaceid": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.708 229250 DEBUG oslo_concurrency.lockutils [req-be12d95c-9a47-4f7a-a2bd-4a6de84eef60 req-c501b821-7d44-40e3-8248-234e47a8b4b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquired lock "refresh_cache-000b9af7-febc-46b4-801c-39d129655fbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.708 229250 DEBUG nova.network.neutron [req-be12d95c-9a47-4f7a-a2bd-4a6de84eef60 req-c501b821-7d44-40e3-8248-234e47a8b4b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Refreshing network info cache for port 39cc9025-b8be-492b-82cf-765dd17ecbc2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.710 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Start _get_guest_xml network_info=[{"id": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "address": "fa:16:3e:a1:92:1a", "network": {"id": "af11da89-c29d-4ef1-80d5-4b619757b0ff", "bridge": "br-int", "label": "tempest-network-smoke--2039147327", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39cc9025-b8", "ovs_interfaceid": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:04:42Z,direct_url=<?>,disk_format='qcow2',id=9489b8a5-a798-4e26-87f9-59bb1eb2e6fd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3e0ab101ca7547d4a515169a0f2edef3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-06T10:04:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '9489b8a5-a798-4e26-87f9-59bb1eb2e6fd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.715 229250 WARNING nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.718 229250 DEBUG nova.virt.libvirt.host [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.719 229250 DEBUG nova.virt.libvirt.host [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.722 229250 DEBUG nova.virt.libvirt.host [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.722 229250 DEBUG nova.virt.libvirt.host [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.723 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.723 229250 DEBUG nova.virt.hardware [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-06T10:04:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0a252b9c-cc5f-41b2-a8b2-94fcf6e74d22',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:04:42Z,direct_url=<?>,disk_format='qcow2',id=9489b8a5-a798-4e26-87f9-59bb1eb2e6fd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3e0ab101ca7547d4a515169a0f2edef3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-06T10:04:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.723 229250 DEBUG nova.virt.hardware [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.724 229250 DEBUG nova.virt.hardware [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.724 229250 DEBUG nova.virt.hardware [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.724 229250 DEBUG nova.virt.hardware [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.724 229250 DEBUG nova.virt.hardware [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.725 229250 DEBUG nova.virt.hardware [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.725 229250 DEBUG nova.virt.hardware [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.725 229250 DEBUG nova.virt.hardware [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.725 229250 DEBUG nova.virt.hardware [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.726 229250 DEBUG nova.virt.hardware [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.729 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:11:04 np0005548918 nova_compute[229246]: 2025-12-06 10:11:04.748 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:11:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:04.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:11:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:04.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:05 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  6 05:11:05 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/737373580' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.175 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.210 229250 DEBUG nova.storage.rbd_utils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 000b9af7-febc-46b4-801c-39d129655fbe_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.217 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:11:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:05 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:05 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  6 05:11:05 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/301725092' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.735 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.738 229250 DEBUG nova.virt.libvirt.vif [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-853218052',display_name='tempest-TestNetworkBasicOps-server-853218052',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-853218052',id=7,image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGL7vrn/akoHm9oxpuwHP+G/0FFkmDNLMfz6CI4NHhMnbsIbZkr92wnPjE4LFBAK7gBqIQZLYPOJYhqmy5T4tOaqFBwfEmlBVOAhU3ks9etlTi/suReVUCKKU9J6iWJ42g==',key_name='tempest-TestNetworkBasicOps-2140451550',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='92b402c8d3e2476abc98be42a1e6d34e',ramdisk_id='',reservation_id='r-zgkq310b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1971100882',owner_user_name='tempest-TestNetworkBasicOps-1971100882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:11:00Z,user_data=None,user_id='03615580775245e6ae335ee9d785611f',uuid=000b9af7-febc-46b4-801c-39d129655fbe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "address": "fa:16:3e:a1:92:1a", "network": {"id": "af11da89-c29d-4ef1-80d5-4b619757b0ff", "bridge": "br-int", "label": "tempest-network-smoke--2039147327", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39cc9025-b8", "ovs_interfaceid": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.739 229250 DEBUG nova.network.os_vif_util [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converting VIF {"id": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "address": "fa:16:3e:a1:92:1a", "network": {"id": "af11da89-c29d-4ef1-80d5-4b619757b0ff", "bridge": "br-int", "label": "tempest-network-smoke--2039147327", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39cc9025-b8", "ovs_interfaceid": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.740 229250 DEBUG nova.network.os_vif_util [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:92:1a,bridge_name='br-int',has_traffic_filtering=True,id=39cc9025-b8be-492b-82cf-765dd17ecbc2,network=Network(af11da89-c29d-4ef1-80d5-4b619757b0ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39cc9025-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.743 229250 DEBUG nova.objects.instance [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lazy-loading 'pci_devices' on Instance uuid 000b9af7-febc-46b4-801c-39d129655fbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.775 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] End _get_guest_xml xml=<domain type="kvm">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  <uuid>000b9af7-febc-46b4-801c-39d129655fbe</uuid>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  <name>instance-00000007</name>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  <memory>131072</memory>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  <vcpu>1</vcpu>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  <metadata>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <nova:name>tempest-TestNetworkBasicOps-server-853218052</nova:name>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <nova:creationTime>2025-12-06 10:11:04</nova:creationTime>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <nova:flavor name="m1.nano">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <nova:memory>128</nova:memory>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <nova:disk>1</nova:disk>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <nova:swap>0</nova:swap>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <nova:ephemeral>0</nova:ephemeral>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <nova:vcpus>1</nova:vcpus>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      </nova:flavor>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <nova:owner>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <nova:user uuid="03615580775245e6ae335ee9d785611f">tempest-TestNetworkBasicOps-1971100882-project-member</nova:user>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <nova:project uuid="92b402c8d3e2476abc98be42a1e6d34e">tempest-TestNetworkBasicOps-1971100882</nova:project>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      </nova:owner>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <nova:root type="image" uuid="9489b8a5-a798-4e26-87f9-59bb1eb2e6fd"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <nova:ports>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <nova:port uuid="39cc9025-b8be-492b-82cf-765dd17ecbc2">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:          <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        </nova:port>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      </nova:ports>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    </nova:instance>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  </metadata>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  <sysinfo type="smbios">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <system>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <entry name="manufacturer">RDO</entry>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <entry name="product">OpenStack Compute</entry>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <entry name="serial">000b9af7-febc-46b4-801c-39d129655fbe</entry>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <entry name="uuid">000b9af7-febc-46b4-801c-39d129655fbe</entry>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <entry name="family">Virtual Machine</entry>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    </system>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  </sysinfo>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  <os>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <boot dev="hd"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <smbios mode="sysinfo"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  </os>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  <features>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <acpi/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <apic/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <vmcoreinfo/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  </features>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  <clock offset="utc">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <timer name="pit" tickpolicy="delay"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <timer name="hpet" present="no"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  </clock>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  <cpu mode="host-model" match="exact">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <topology sockets="1" cores="1" threads="1"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  </cpu>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  <devices>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <disk type="network" device="disk">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <driver type="raw" cache="none"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <source protocol="rbd" name="vms/000b9af7-febc-46b4-801c-39d129655fbe_disk">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <host name="192.168.122.100" port="6789"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <host name="192.168.122.102" port="6789"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <host name="192.168.122.101" port="6789"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      </source>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <auth username="openstack">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <secret type="ceph" uuid="5ecd3f74-dade-5fc4-92ce-8950ae424258"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      </auth>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <target dev="vda" bus="virtio"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    </disk>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <disk type="network" device="cdrom">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <driver type="raw" cache="none"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <source protocol="rbd" name="vms/000b9af7-febc-46b4-801c-39d129655fbe_disk.config">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <host name="192.168.122.100" port="6789"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <host name="192.168.122.102" port="6789"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <host name="192.168.122.101" port="6789"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      </source>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <auth username="openstack">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:        <secret type="ceph" uuid="5ecd3f74-dade-5fc4-92ce-8950ae424258"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      </auth>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <target dev="sda" bus="sata"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    </disk>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <interface type="ethernet">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <mac address="fa:16:3e:a1:92:1a"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <model type="virtio"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <driver name="vhost" rx_queue_size="512"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <mtu size="1442"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <target dev="tap39cc9025-b8"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    </interface>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <serial type="pty">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <log file="/var/lib/nova/instances/000b9af7-febc-46b4-801c-39d129655fbe/console.log" append="off"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    </serial>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <video>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <model type="virtio"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    </video>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <input type="tablet" bus="usb"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <rng model="virtio">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <backend model="random">/dev/urandom</backend>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    </rng>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <controller type="usb" index="0"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    <memballoon model="virtio">
Dec  6 05:11:05 np0005548918 nova_compute[229246]:      <stats period="10"/>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:    </memballoon>
Dec  6 05:11:05 np0005548918 nova_compute[229246]:  </devices>
Dec  6 05:11:05 np0005548918 nova_compute[229246]: </domain>
Dec  6 05:11:05 np0005548918 nova_compute[229246]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.777 229250 DEBUG nova.compute.manager [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Preparing to wait for external event network-vif-plugged-39cc9025-b8be-492b-82cf-765dd17ecbc2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.777 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "000b9af7-febc-46b4-801c-39d129655fbe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.778 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.778 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.779 229250 DEBUG nova.virt.libvirt.vif [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-853218052',display_name='tempest-TestNetworkBasicOps-server-853218052',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-853218052',id=7,image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGL7vrn/akoHm9oxpuwHP+G/0FFkmDNLMfz6CI4NHhMnbsIbZkr92wnPjE4LFBAK7gBqIQZLYPOJYhqmy5T4tOaqFBwfEmlBVOAhU3ks9etlTi/suReVUCKKU9J6iWJ42g==',key_name='tempest-TestNetworkBasicOps-2140451550',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='92b402c8d3e2476abc98be42a1e6d34e',ramdisk_id='',reservation_id='r-zgkq310b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1971100882',owner_user_name='tempest-TestNetworkBasicOps-1971100882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:11:00Z,user_data=None,user_id='03615580775245e6ae335ee9d785611f',uuid=000b9af7-febc-46b4-801c-39d129655fbe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "address": "fa:16:3e:a1:92:1a", "network": {"id": "af11da89-c29d-4ef1-80d5-4b619757b0ff", "bridge": "br-int", "label": "tempest-network-smoke--2039147327", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39cc9025-b8", "ovs_interfaceid": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.779 229250 DEBUG nova.network.os_vif_util [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converting VIF {"id": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "address": "fa:16:3e:a1:92:1a", "network": {"id": "af11da89-c29d-4ef1-80d5-4b619757b0ff", "bridge": "br-int", "label": "tempest-network-smoke--2039147327", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39cc9025-b8", "ovs_interfaceid": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.780 229250 DEBUG nova.network.os_vif_util [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:92:1a,bridge_name='br-int',has_traffic_filtering=True,id=39cc9025-b8be-492b-82cf-765dd17ecbc2,network=Network(af11da89-c29d-4ef1-80d5-4b619757b0ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39cc9025-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.780 229250 DEBUG os_vif [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:92:1a,bridge_name='br-int',has_traffic_filtering=True,id=39cc9025-b8be-492b-82cf-765dd17ecbc2,network=Network(af11da89-c29d-4ef1-80d5-4b619757b0ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39cc9025-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.781 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.781 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.782 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.785 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.786 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap39cc9025-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.786 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap39cc9025-b8, col_values=(('external_ids', {'iface-id': '39cc9025-b8be-492b-82cf-765dd17ecbc2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a1:92:1a', 'vm-uuid': '000b9af7-febc-46b4-801c-39d129655fbe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.826 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:05 np0005548918 NetworkManager[48884]: <info>  [1765015865.8280] manager: (tap39cc9025-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.831 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.834 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.835 229250 INFO os_vif [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:92:1a,bridge_name='br-int',has_traffic_filtering=True,id=39cc9025-b8be-492b-82cf-765dd17ecbc2,network=Network(af11da89-c29d-4ef1-80d5-4b619757b0ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39cc9025-b8')#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.898 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.899 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.899 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] No VIF found with MAC fa:16:3e:a1:92:1a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.900 229250 INFO nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Using config drive#033[00m
Dec  6 05:11:05 np0005548918 nova_compute[229246]: 2025-12-06 10:11:05.929 229250 DEBUG nova.storage.rbd_utils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 000b9af7-febc-46b4-801c-39d129655fbe_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:11:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:06 np0005548918 nova_compute[229246]: 2025-12-06 10:11:06.010 229250 DEBUG nova.network.neutron [req-be12d95c-9a47-4f7a-a2bd-4a6de84eef60 req-c501b821-7d44-40e3-8248-234e47a8b4b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Updated VIF entry in instance network info cache for port 39cc9025-b8be-492b-82cf-765dd17ecbc2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  6 05:11:06 np0005548918 nova_compute[229246]: 2025-12-06 10:11:06.010 229250 DEBUG nova.network.neutron [req-be12d95c-9a47-4f7a-a2bd-4a6de84eef60 req-c501b821-7d44-40e3-8248-234e47a8b4b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Updating instance_info_cache with network_info: [{"id": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "address": "fa:16:3e:a1:92:1a", "network": {"id": "af11da89-c29d-4ef1-80d5-4b619757b0ff", "bridge": "br-int", "label": "tempest-network-smoke--2039147327", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39cc9025-b8", "ovs_interfaceid": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:11:06 np0005548918 nova_compute[229246]: 2025-12-06 10:11:06.028 229250 DEBUG oslo_concurrency.lockutils [req-be12d95c-9a47-4f7a-a2bd-4a6de84eef60 req-c501b821-7d44-40e3-8248-234e47a8b4b4 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Releasing lock "refresh_cache-000b9af7-febc-46b4-801c-39d129655fbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:11:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:06 np0005548918 nova_compute[229246]: 2025-12-06 10:11:06.375 229250 INFO nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Creating config drive at /var/lib/nova/instances/000b9af7-febc-46b4-801c-39d129655fbe/disk.config#033[00m
Dec  6 05:11:06 np0005548918 nova_compute[229246]: 2025-12-06 10:11:06.384 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/000b9af7-febc-46b4-801c-39d129655fbe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu4bux_y0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:11:06 np0005548918 nova_compute[229246]: 2025-12-06 10:11:06.532 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/000b9af7-febc-46b4-801c-39d129655fbe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu4bux_y0" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:11:06 np0005548918 nova_compute[229246]: 2025-12-06 10:11:06.560 229250 DEBUG nova.storage.rbd_utils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 000b9af7-febc-46b4-801c-39d129655fbe_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:11:06 np0005548918 nova_compute[229246]: 2025-12-06 10:11:06.563 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/000b9af7-febc-46b4-801c-39d129655fbe/disk.config 000b9af7-febc-46b4-801c-39d129655fbe_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:11:06 np0005548918 nova_compute[229246]: 2025-12-06 10:11:06.733 229250 DEBUG oslo_concurrency.processutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/000b9af7-febc-46b4-801c-39d129655fbe/disk.config 000b9af7-febc-46b4-801c-39d129655fbe_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.170s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:11:06 np0005548918 nova_compute[229246]: 2025-12-06 10:11:06.735 229250 INFO nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Deleting local config drive /var/lib/nova/instances/000b9af7-febc-46b4-801c-39d129655fbe/disk.config because it was imported into RBD.#033[00m
Dec  6 05:11:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:06.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:06 np0005548918 kernel: tap39cc9025-b8: entered promiscuous mode
Dec  6 05:11:06 np0005548918 NetworkManager[48884]: <info>  [1765015866.7948] manager: (tap39cc9025-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Dec  6 05:11:06 np0005548918 ovn_controller[132371]: 2025-12-06T10:11:06Z|00054|binding|INFO|Claiming lport 39cc9025-b8be-492b-82cf-765dd17ecbc2 for this chassis.
Dec  6 05:11:06 np0005548918 ovn_controller[132371]: 2025-12-06T10:11:06Z|00055|binding|INFO|39cc9025-b8be-492b-82cf-765dd17ecbc2: Claiming fa:16:3e:a1:92:1a 10.100.0.27
Dec  6 05:11:06 np0005548918 nova_compute[229246]: 2025-12-06 10:11:06.797 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:06.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:06.813 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:92:1a 10.100.0.27'], port_security=['fa:16:3e:a1:92:1a 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '000b9af7-febc-46b4-801c-39d129655fbe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af11da89-c29d-4ef1-80d5-4b619757b0ff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5d7daa58-d281-4d6a-bf46-6774db6606b8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f5b6720-4878-43e8-9823-306ee6c3568e, chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], logical_port=39cc9025-b8be-492b-82cf-765dd17ecbc2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:11:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:06.815 141640 INFO neutron.agent.ovn.metadata.agent [-] Port 39cc9025-b8be-492b-82cf-765dd17ecbc2 in datapath af11da89-c29d-4ef1-80d5-4b619757b0ff bound to our chassis#033[00m
Dec  6 05:11:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:06.816 141640 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network af11da89-c29d-4ef1-80d5-4b619757b0ff#033[00m
Dec  6 05:11:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:06.830 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[8e9c4f45-290f-4e3e-a7a5-a071bbbfc3a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:06.832 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaf11da89-c1 in ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  6 05:11:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:06.833 233203 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaf11da89-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  6 05:11:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:06.833 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[ef9d931c-88dc-47bb-a3a5-b6b9fa641e83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:06.834 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[ddad5477-0108-40f2-9fd7-190e5d04f879]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:06 np0005548918 systemd-machined[192688]: New machine qemu-3-instance-00000007.
Dec  6 05:11:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:06.848 141754 DEBUG oslo.privsep.daemon [-] privsep: reply[903537f4-13d4-4966-8485-87b9dc95e76c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:06 np0005548918 ovn_controller[132371]: 2025-12-06T10:11:06Z|00056|binding|INFO|Setting lport 39cc9025-b8be-492b-82cf-765dd17ecbc2 ovn-installed in OVS
Dec  6 05:11:06 np0005548918 ovn_controller[132371]: 2025-12-06T10:11:06Z|00057|binding|INFO|Setting lport 39cc9025-b8be-492b-82cf-765dd17ecbc2 up in Southbound
Dec  6 05:11:06 np0005548918 nova_compute[229246]: 2025-12-06 10:11:06.919 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:06 np0005548918 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Dec  6 05:11:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:06.932 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[f4828a69-38cc-4e80-a4d3-ffeb2debca85]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:06 np0005548918 systemd-udevd[236536]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 05:11:06 np0005548918 NetworkManager[48884]: <info>  [1765015866.9503] device (tap39cc9025-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  6 05:11:06 np0005548918 NetworkManager[48884]: <info>  [1765015866.9510] device (tap39cc9025-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  6 05:11:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:06.963 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[294d6681-0c66-4364-afc7-d70803bdfb5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:06 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:06.968 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e91853-a2de-4a33-9808-d1b73b2b0157]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:06 np0005548918 NetworkManager[48884]: <info>  [1765015866.9697] manager: (tapaf11da89-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Dec  6 05:11:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.009 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[1b4cfd3f-c5eb-4472-b8f4-5a89b198359b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.012 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[78f4b20b-cede-4806-8fdc-e80963e33293]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:07 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:07 np0005548918 NetworkManager[48884]: <info>  [1765015867.0333] device (tapaf11da89-c0): carrier: link connected
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.039 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[cf530819-c0d4-47dd-9b5d-7d2c47110d85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.055 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad52e6d-5cd1-4c75-95b3-fd498ef881bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaf11da89-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:fe:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424063, 'reachable_time': 16386, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236566, 'error': None, 'target': 'ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.071 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[75381238-225c-4583-ad9f-8d5514d7edf4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:fe2e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424063, 'tstamp': 424063}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236567, 'error': None, 'target': 'ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.087 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d1d520-cfb7-4fe7-bdac-09966144b255]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaf11da89-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:fe:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424063, 'reachable_time': 16386, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236568, 'error': None, 'target': 'ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.112 229250 DEBUG nova.compute.manager [req-f5204fdc-8f5e-414a-8bda-8ae6a26f721b req-11b1f619-ff18-4e5a-bec5-7a37152efbd2 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Received event network-vif-plugged-39cc9025-b8be-492b-82cf-765dd17ecbc2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.112 229250 DEBUG oslo_concurrency.lockutils [req-f5204fdc-8f5e-414a-8bda-8ae6a26f721b req-11b1f619-ff18-4e5a-bec5-7a37152efbd2 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "000b9af7-febc-46b4-801c-39d129655fbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.113 229250 DEBUG oslo_concurrency.lockutils [req-f5204fdc-8f5e-414a-8bda-8ae6a26f721b req-11b1f619-ff18-4e5a-bec5-7a37152efbd2 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.113 229250 DEBUG oslo_concurrency.lockutils [req-f5204fdc-8f5e-414a-8bda-8ae6a26f721b req-11b1f619-ff18-4e5a-bec5-7a37152efbd2 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.113 229250 DEBUG nova.compute.manager [req-f5204fdc-8f5e-414a-8bda-8ae6a26f721b req-11b1f619-ff18-4e5a-bec5-7a37152efbd2 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Processing event network-vif-plugged-39cc9025-b8be-492b-82cf-765dd17ecbc2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.125 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[ae8f4dc8-ba54-4acb-a109-526774a9fe4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.186 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[467841bf-8fd3-4a28-8378-3fd4b46d9bd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.188 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf11da89-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.189 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.189 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf11da89-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:11:07 np0005548918 kernel: tapaf11da89-c0: entered promiscuous mode
Dec  6 05:11:07 np0005548918 NetworkManager[48884]: <info>  [1765015867.1920] manager: (tapaf11da89-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.196 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaf11da89-c0, col_values=(('external_ids', {'iface-id': '11d93e6a-f3e6-434c-bb3f-39cb96f417cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:11:07 np0005548918 ovn_controller[132371]: 2025-12-06T10:11:07Z|00058|binding|INFO|Releasing lport 11d93e6a-f3e6-434c-bb3f-39cb96f417cf from this chassis (sb_readonly=0)
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.202 141640 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/af11da89-c29d-4ef1-80d5-4b619757b0ff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/af11da89-c29d-4ef1-80d5-4b619757b0ff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.191 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.206 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[3bfb6903-d932-4c6d-93ed-3353045a1a4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.208 141640 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: global
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    log         /dev/log local0 debug
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    log-tag     haproxy-metadata-proxy-af11da89-c29d-4ef1-80d5-4b619757b0ff
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    user        root
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    group       root
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    maxconn     1024
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    pidfile     /var/lib/neutron/external/pids/af11da89-c29d-4ef1-80d5-4b619757b0ff.pid.haproxy
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    daemon
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: defaults
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    log global
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    mode http
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    option httplog
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    option dontlognull
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    option http-server-close
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    option forwardfor
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    retries                 3
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    timeout http-request    30s
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    timeout connect         30s
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    timeout client          32s
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    timeout server          32s
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    timeout http-keep-alive 30s
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: listen listener
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    bind 169.254.169.254:80
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    server metadata /var/lib/neutron/metadata_proxy
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]:    http-request add-header X-OVN-Network-ID af11da89-c29d-4ef1-80d5-4b619757b0ff
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  6 05:11:07 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:07.209 141640 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff', 'env', 'PROCESS_TAG=haproxy-af11da89-c29d-4ef1-80d5-4b619757b0ff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/af11da89-c29d-4ef1-80d5-4b619757b0ff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.220 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:07 np0005548918 podman[236601]: 2025-12-06 10:11:07.579198195 +0000 UTC m=+0.046671315 container create 961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec  6 05:11:07 np0005548918 systemd[1]: Started libpod-conmon-961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12.scope.
Dec  6 05:11:07 np0005548918 podman[236601]: 2025-12-06 10:11:07.554007203 +0000 UTC m=+0.021480343 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
Dec  6 05:11:07 np0005548918 systemd[1]: Started libcrun container.
Dec  6 05:11:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:07 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:07 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:07 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94f7c4219cb2dda889a6811e9ad21940ffcdc73dc9177050f41b12b5e6ad3730/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.669 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:11:07 np0005548918 podman[236601]: 2025-12-06 10:11:07.68650261 +0000 UTC m=+0.153975750 container init 961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:11:07 np0005548918 podman[236601]: 2025-12-06 10:11:07.694875886 +0000 UTC m=+0.162349006 container start 961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.708 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Triggering sync for uuid 000b9af7-febc-46b4-801c-39d129655fbe _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.709 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "000b9af7-febc-46b4-801c-39d129655fbe" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:07 np0005548918 neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff[236630]: [NOTICE]   (236653) : New worker (236658) forked
Dec  6 05:11:07 np0005548918 neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff[236630]: [NOTICE]   (236653) : Loading success.
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.823 229250 DEBUG nova.virt.driver [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Emitting event <LifecycleEvent: 1765015867.8223722, 000b9af7-febc-46b4-801c-39d129655fbe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.823 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] VM Started (Lifecycle Event)#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.826 229250 DEBUG nova.compute.manager [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.831 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.836 229250 INFO nova.virt.libvirt.driver [-] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Instance spawned successfully.#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.837 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.840 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.845 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.858 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.859 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.860 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.860 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.860 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.861 229250 DEBUG nova.virt.libvirt.driver [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.866 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.867 229250 DEBUG nova.virt.driver [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Emitting event <LifecycleEvent: 1765015867.8235016, 000b9af7-febc-46b4-801c-39d129655fbe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.867 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] VM Paused (Lifecycle Event)#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.903 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.907 229250 DEBUG nova.virt.driver [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Emitting event <LifecycleEvent: 1765015867.8304546, 000b9af7-febc-46b4-801c-39d129655fbe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.908 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] VM Resumed (Lifecycle Event)#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.929 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.934 229250 INFO nova.compute.manager [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Took 6.92 seconds to spawn the instance on the hypervisor.#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.934 229250 DEBUG nova.compute.manager [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.936 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  6 05:11:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:07 np0005548918 nova_compute[229246]: 2025-12-06 10:11:07.980 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  6 05:11:08 np0005548918 nova_compute[229246]: 2025-12-06 10:11:08.033 229250 INFO nova.compute.manager [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Took 7.97 seconds to build instance.#033[00m
Dec  6 05:11:08 np0005548918 nova_compute[229246]: 2025-12-06 10:11:08.055 229250 DEBUG oslo_concurrency.lockutils [None req-4619b365-ce7d-49a3-8363-6b1204cccf80 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:08 np0005548918 nova_compute[229246]: 2025-12-06 10:11:08.056 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "000b9af7-febc-46b4-801c-39d129655fbe" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:08 np0005548918 nova_compute[229246]: 2025-12-06 10:11:08.056 229250 INFO nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  6 05:11:08 np0005548918 nova_compute[229246]: 2025-12-06 10:11:08.056 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "000b9af7-febc-46b4-801c-39d129655fbe" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:08.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:11:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:08.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:11:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:09 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:09 np0005548918 nova_compute[229246]: 2025-12-06 10:11:09.244 229250 DEBUG nova.compute.manager [req-63f5af1c-bfb4-4189-aa62-362d5c8fc7a7 req-3d951cd7-5052-4438-a256-0ae4d0c980c1 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Received event network-vif-plugged-39cc9025-b8be-492b-82cf-765dd17ecbc2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:11:09 np0005548918 nova_compute[229246]: 2025-12-06 10:11:09.245 229250 DEBUG oslo_concurrency.lockutils [req-63f5af1c-bfb4-4189-aa62-362d5c8fc7a7 req-3d951cd7-5052-4438-a256-0ae4d0c980c1 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "000b9af7-febc-46b4-801c-39d129655fbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:09 np0005548918 nova_compute[229246]: 2025-12-06 10:11:09.245 229250 DEBUG oslo_concurrency.lockutils [req-63f5af1c-bfb4-4189-aa62-362d5c8fc7a7 req-3d951cd7-5052-4438-a256-0ae4d0c980c1 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:09 np0005548918 nova_compute[229246]: 2025-12-06 10:11:09.245 229250 DEBUG oslo_concurrency.lockutils [req-63f5af1c-bfb4-4189-aa62-362d5c8fc7a7 req-3d951cd7-5052-4438-a256-0ae4d0c980c1 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:09 np0005548918 nova_compute[229246]: 2025-12-06 10:11:09.245 229250 DEBUG nova.compute.manager [req-63f5af1c-bfb4-4189-aa62-362d5c8fc7a7 req-3d951cd7-5052-4438-a256-0ae4d0c980c1 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] No waiting events found dispatching network-vif-plugged-39cc9025-b8be-492b-82cf-765dd17ecbc2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:11:09 np0005548918 nova_compute[229246]: 2025-12-06 10:11:09.245 229250 WARNING nova.compute.manager [req-63f5af1c-bfb4-4189-aa62-362d5c8fc7a7 req-3d951cd7-5052-4438-a256-0ae4d0c980c1 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Received unexpected event network-vif-plugged-39cc9025-b8be-492b-82cf-765dd17ecbc2 for instance with vm_state active and task_state None.#033[00m
Dec  6 05:11:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:09 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:09 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:09 np0005548918 nova_compute[229246]: 2025-12-06 10:11:09.745 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:10.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:10.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:10 np0005548918 nova_compute[229246]: 2025-12-06 10:11:10.826 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:11 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:11 np0005548918 podman[236676]: 2025-12-06 10:11:11.191082956 +0000 UTC m=+0.082319569 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec  6 05:11:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:11 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:11 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:12.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:12.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:13 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:13 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:13 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:14.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:14 np0005548918 nova_compute[229246]: 2025-12-06 10:11:14.807 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:14.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:15 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:15 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:15 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc35c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:15 np0005548918 nova_compute[229246]: 2025-12-06 10:11:15.869 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:16.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:16.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:17 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:17 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:17 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:11:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:18.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:11:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:11:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:18.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:11:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:19 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3900089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:19 np0005548918 podman[236734]: 2025-12-06 10:11:19.173534077 +0000 UTC m=+0.058750961 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:11:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:19 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:19 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:19 np0005548918 nova_compute[229246]: 2025-12-06 10:11:19.844 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:19 np0005548918 ovn_controller[132371]: 2025-12-06T10:11:19Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a1:92:1a 10.100.0.27
Dec  6 05:11:19 np0005548918 ovn_controller[132371]: 2025-12-06T10:11:19Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a1:92:1a 10.100.0.27
Dec  6 05:11:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:20.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:20.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:20 np0005548918 nova_compute[229246]: 2025-12-06 10:11:20.919 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:21 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:21 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc3900089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:21 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:11:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:22.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:11:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:22.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:23 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:23 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:23 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:11:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:24.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:11:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:24.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:24 np0005548918 nova_compute[229246]: 2025-12-06 10:11:24.848 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:25 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001ac0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:25 np0005548918 podman[236760]: 2025-12-06 10:11:25.186725467 +0000 UTC m=+0.075087844 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  6 05:11:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:25 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:25 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:25 np0005548918 nova_compute[229246]: 2025-12-06 10:11:25.922 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.002000053s ======
Dec  6 05:11:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:26.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Dec  6 05:11:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:26.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:27 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:27 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001ac0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:27 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:28.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:28.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:29 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:29 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc350004460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:29 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001ac0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:29 np0005548918 nova_compute[229246]: 2025-12-06 10:11:29.850 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:30.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:30.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:30 np0005548918 nova_compute[229246]: 2025-12-06 10:11:30.925 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:31 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:31 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:31 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.567 229250 DEBUG oslo_concurrency.lockutils [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "000b9af7-febc-46b4-801c-39d129655fbe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.569 229250 DEBUG oslo_concurrency.lockutils [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.569 229250 DEBUG oslo_concurrency.lockutils [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "000b9af7-febc-46b4-801c-39d129655fbe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.569 229250 DEBUG oslo_concurrency.lockutils [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.569 229250 DEBUG oslo_concurrency.lockutils [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.570 229250 INFO nova.compute.manager [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Terminating instance#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.571 229250 DEBUG nova.compute.manager [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  6 05:11:32 np0005548918 kernel: tap39cc9025-b8 (unregistering): left promiscuous mode
Dec  6 05:11:32 np0005548918 NetworkManager[48884]: <info>  [1765015892.6310] device (tap39cc9025-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  6 05:11:32 np0005548918 ovn_controller[132371]: 2025-12-06T10:11:32Z|00059|binding|INFO|Releasing lport 39cc9025-b8be-492b-82cf-765dd17ecbc2 from this chassis (sb_readonly=0)
Dec  6 05:11:32 np0005548918 ovn_controller[132371]: 2025-12-06T10:11:32Z|00060|binding|INFO|Setting lport 39cc9025-b8be-492b-82cf-765dd17ecbc2 down in Southbound
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.640 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:32 np0005548918 ovn_controller[132371]: 2025-12-06T10:11:32Z|00061|binding|INFO|Removing iface tap39cc9025-b8 ovn-installed in OVS
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.643 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:32 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:32.649 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:92:1a 10.100.0.27'], port_security=['fa:16:3e:a1:92:1a 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '000b9af7-febc-46b4-801c-39d129655fbe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af11da89-c29d-4ef1-80d5-4b619757b0ff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5d7daa58-d281-4d6a-bf46-6774db6606b8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f5b6720-4878-43e8-9823-306ee6c3568e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], logical_port=39cc9025-b8be-492b-82cf-765dd17ecbc2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:11:32 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:32.651 141640 INFO neutron.agent.ovn.metadata.agent [-] Port 39cc9025-b8be-492b-82cf-765dd17ecbc2 in datapath af11da89-c29d-4ef1-80d5-4b619757b0ff unbound from our chassis#033[00m
Dec  6 05:11:32 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:32.652 141640 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network af11da89-c29d-4ef1-80d5-4b619757b0ff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  6 05:11:32 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:32.653 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[1499e3af-2fb2-4c18-aa9b-febf359cd829]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:32 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:32.653 141640 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff namespace which is not needed anymore#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.661 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:32 np0005548918 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec  6 05:11:32 np0005548918 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 14.050s CPU time.
Dec  6 05:11:32 np0005548918 systemd-machined[192688]: Machine qemu-3-instance-00000007 terminated.
Dec  6 05:11:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:32.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.809 229250 INFO nova.virt.libvirt.driver [-] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Instance destroyed successfully.#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.809 229250 DEBUG nova.objects.instance [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lazy-loading 'resources' on Instance uuid 000b9af7-febc-46b4-801c-39d129655fbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  6 05:11:32 np0005548918 neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff[236630]: [NOTICE]   (236653) : haproxy version is 2.8.14-c23fe91
Dec  6 05:11:32 np0005548918 neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff[236630]: [NOTICE]   (236653) : path to executable is /usr/sbin/haproxy
Dec  6 05:11:32 np0005548918 neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff[236630]: [WARNING]  (236653) : Exiting Master process...
Dec  6 05:11:32 np0005548918 neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff[236630]: [ALERT]    (236653) : Current worker (236658) exited with code 143 (Terminated)
Dec  6 05:11:32 np0005548918 neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff[236630]: [WARNING]  (236653) : All workers exited. Exiting... (0)
Dec  6 05:11:32 np0005548918 systemd[1]: libpod-961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12.scope: Deactivated successfully.
Dec  6 05:11:32 np0005548918 podman[236948]: 2025-12-06 10:11:32.826763096 +0000 UTC m=+0.070533750 container died 961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.826 229250 DEBUG nova.virt.libvirt.vif [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-06T10:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-853218052',display_name='tempest-TestNetworkBasicOps-server-853218052',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-853218052',id=7,image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGL7vrn/akoHm9oxpuwHP+G/0FFkmDNLMfz6CI4NHhMnbsIbZkr92wnPjE4LFBAK7gBqIQZLYPOJYhqmy5T4tOaqFBwfEmlBVOAhU3ks9etlTi/suReVUCKKU9J6iWJ42g==',key_name='tempest-TestNetworkBasicOps-2140451550',keypairs=<?>,launch_index=0,launched_at=2025-12-06T10:11:07Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='92b402c8d3e2476abc98be42a1e6d34e',ramdisk_id='',reservation_id='r-zgkq310b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1971100882',owner_user_name='tempest-TestNetworkBasicOps-1971100882-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T10:11:07Z,user_data=None,user_id='03615580775245e6ae335ee9d785611f',uuid=000b9af7-febc-46b4-801c-39d129655fbe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "address": "fa:16:3e:a1:92:1a", "network": {"id": "af11da89-c29d-4ef1-80d5-4b619757b0ff", "bridge": "br-int", "label": "tempest-network-smoke--2039147327", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39cc9025-b8", "ovs_interfaceid": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.827 229250 DEBUG nova.network.os_vif_util [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converting VIF {"id": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "address": "fa:16:3e:a1:92:1a", "network": {"id": "af11da89-c29d-4ef1-80d5-4b619757b0ff", "bridge": "br-int", "label": "tempest-network-smoke--2039147327", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39cc9025-b8", "ovs_interfaceid": "39cc9025-b8be-492b-82cf-765dd17ecbc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.828 229250 DEBUG nova.network.os_vif_util [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:92:1a,bridge_name='br-int',has_traffic_filtering=True,id=39cc9025-b8be-492b-82cf-765dd17ecbc2,network=Network(af11da89-c29d-4ef1-80d5-4b619757b0ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39cc9025-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.828 229250 DEBUG os_vif [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:92:1a,bridge_name='br-int',has_traffic_filtering=True,id=39cc9025-b8be-492b-82cf-765dd17ecbc2,network=Network(af11da89-c29d-4ef1-80d5-4b619757b0ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39cc9025-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.830 229250 DEBUG nova.compute.manager [req-13ff4731-f246-4ff7-ab63-3efaa5ba75de req-4f1eb10e-ae23-40b4-b598-eefb63d490d0 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Received event network-vif-unplugged-39cc9025-b8be-492b-82cf-765dd17ecbc2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.831 229250 DEBUG oslo_concurrency.lockutils [req-13ff4731-f246-4ff7-ab63-3efaa5ba75de req-4f1eb10e-ae23-40b4-b598-eefb63d490d0 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "000b9af7-febc-46b4-801c-39d129655fbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.831 229250 DEBUG oslo_concurrency.lockutils [req-13ff4731-f246-4ff7-ab63-3efaa5ba75de req-4f1eb10e-ae23-40b4-b598-eefb63d490d0 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.831 229250 DEBUG oslo_concurrency.lockutils [req-13ff4731-f246-4ff7-ab63-3efaa5ba75de req-4f1eb10e-ae23-40b4-b598-eefb63d490d0 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.831 229250 DEBUG nova.compute.manager [req-13ff4731-f246-4ff7-ab63-3efaa5ba75de req-4f1eb10e-ae23-40b4-b598-eefb63d490d0 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] No waiting events found dispatching network-vif-unplugged-39cc9025-b8be-492b-82cf-765dd17ecbc2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.831 229250 DEBUG nova.compute.manager [req-13ff4731-f246-4ff7-ab63-3efaa5ba75de req-4f1eb10e-ae23-40b4-b598-eefb63d490d0 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Received event network-vif-unplugged-39cc9025-b8be-492b-82cf-765dd17ecbc2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.832 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.832 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39cc9025-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.833 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:11:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:32.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.835 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.838 229250 INFO os_vif [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:92:1a,bridge_name='br-int',has_traffic_filtering=True,id=39cc9025-b8be-492b-82cf-765dd17ecbc2,network=Network(af11da89-c29d-4ef1-80d5-4b619757b0ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39cc9025-b8')#033[00m
Dec  6 05:11:32 np0005548918 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12-userdata-shm.mount: Deactivated successfully.
Dec  6 05:11:32 np0005548918 systemd[1]: var-lib-containers-storage-overlay-94f7c4219cb2dda889a6811e9ad21940ffcdc73dc9177050f41b12b5e6ad3730-merged.mount: Deactivated successfully.
Dec  6 05:11:32 np0005548918 podman[236948]: 2025-12-06 10:11:32.873239005 +0000 UTC m=+0.117009659 container cleanup 961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  6 05:11:32 np0005548918 systemd[1]: libpod-conmon-961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12.scope: Deactivated successfully.
Dec  6 05:11:32 np0005548918 podman[237031]: 2025-12-06 10:11:32.955768359 +0000 UTC m=+0.061642540 container remove 961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  6 05:11:32 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:32.963 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[71590c35-a990-43e9-ae7a-2188f10abce6]: (4, ('Sat Dec  6 10:11:32 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff (961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12)\n961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12\nSat Dec  6 10:11:32 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff (961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12)\n961a385544c1555569ca00a15c9d3d82627e9bc722e61b33f766c0ecabea1c12\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:32 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:32.965 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd617ae-af87-43b0-9537-582332ec6182]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:32 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:32.966 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf11da89-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:11:32 np0005548918 kernel: tapaf11da89-c0: left promiscuous mode
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.968 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:32 np0005548918 nova_compute[229246]: 2025-12-06 10:11:32.981 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:32 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:32.984 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[a44bb44e-7428-42ef-a9a7-b992bc384ffc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:32 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:32.997 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[0e0c647c-5731-413d-b0b3-b2a842174c3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:33 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:32.999 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[596c5659-0e78-433d-a89f-b3cc6d547956]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:33 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:33.014 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[57f82e69-04c3-498f-b7b1-d7253360e8bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424056, 'reachable_time': 21440, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237091, 'error': None, 'target': 'ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:33 np0005548918 systemd[1]: run-netns-ovnmeta\x2daf11da89\x2dc29d\x2d4ef1\x2d80d5\x2d4b619757b0ff.mount: Deactivated successfully.
Dec  6 05:11:33 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:33.018 141754 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-af11da89-c29d-4ef1-80d5-4b619757b0ff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  6 05:11:33 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:33.018 141754 DEBUG oslo.privsep.daemon [-] privsep: reply[f1598c75-4b89-4c57-aea5-5d9c31f4e260]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:11:33 np0005548918 podman[237081]: 2025-12-06 10:11:33.041357246 +0000 UTC m=+0.044423034 container create 12180da64500c87376048208bb42470a1e922c247caa4d34fc1e8551541fb54a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_almeida, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 05:11:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:33 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:33 np0005548918 systemd[1]: Started libpod-conmon-12180da64500c87376048208bb42470a1e922c247caa4d34fc1e8551541fb54a.scope.
Dec  6 05:11:33 np0005548918 systemd[1]: Started libcrun container.
Dec  6 05:11:33 np0005548918 podman[237081]: 2025-12-06 10:11:33.026596447 +0000 UTC m=+0.029662255 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 05:11:33 np0005548918 podman[237081]: 2025-12-06 10:11:33.130869769 +0000 UTC m=+0.133935577 container init 12180da64500c87376048208bb42470a1e922c247caa4d34fc1e8551541fb54a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_almeida, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  6 05:11:33 np0005548918 podman[237081]: 2025-12-06 10:11:33.138628 +0000 UTC m=+0.141693798 container start 12180da64500c87376048208bb42470a1e922c247caa4d34fc1e8551541fb54a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_almeida, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  6 05:11:33 np0005548918 podman[237081]: 2025-12-06 10:11:33.144071297 +0000 UTC m=+0.147137105 container attach 12180da64500c87376048208bb42470a1e922c247caa4d34fc1e8551541fb54a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_almeida, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  6 05:11:33 np0005548918 relaxed_almeida[237102]: 167 167
Dec  6 05:11:33 np0005548918 systemd[1]: libpod-12180da64500c87376048208bb42470a1e922c247caa4d34fc1e8551541fb54a.scope: Deactivated successfully.
Dec  6 05:11:33 np0005548918 podman[237081]: 2025-12-06 10:11:33.146717349 +0000 UTC m=+0.149783157 container died 12180da64500c87376048208bb42470a1e922c247caa4d34fc1e8551541fb54a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_almeida, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 05:11:33 np0005548918 systemd[1]: var-lib-containers-storage-overlay-2e48677ed940a66f087d0fce46ad82da319f0d5bc91619a4dd82902de1d963b9-merged.mount: Deactivated successfully.
Dec  6 05:11:33 np0005548918 podman[237081]: 2025-12-06 10:11:33.186760193 +0000 UTC m=+0.189825991 container remove 12180da64500c87376048208bb42470a1e922c247caa4d34fc1e8551541fb54a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_almeida, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Dec  6 05:11:33 np0005548918 systemd[1]: libpod-conmon-12180da64500c87376048208bb42470a1e922c247caa4d34fc1e8551541fb54a.scope: Deactivated successfully.
Dec  6 05:11:33 np0005548918 nova_compute[229246]: 2025-12-06 10:11:33.319 229250 INFO nova.virt.libvirt.driver [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Deleting instance files /var/lib/nova/instances/000b9af7-febc-46b4-801c-39d129655fbe_del#033[00m
Dec  6 05:11:33 np0005548918 nova_compute[229246]: 2025-12-06 10:11:33.320 229250 INFO nova.virt.libvirt.driver [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Deletion of /var/lib/nova/instances/000b9af7-febc-46b4-801c-39d129655fbe_del complete#033[00m
Dec  6 05:11:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:33 np0005548918 nova_compute[229246]: 2025-12-06 10:11:33.398 229250 INFO nova.compute.manager [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Took 0.83 seconds to destroy the instance on the hypervisor.#033[00m
Dec  6 05:11:33 np0005548918 nova_compute[229246]: 2025-12-06 10:11:33.399 229250 DEBUG oslo.service.loopingcall [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  6 05:11:33 np0005548918 nova_compute[229246]: 2025-12-06 10:11:33.399 229250 DEBUG nova.compute.manager [-] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  6 05:11:33 np0005548918 nova_compute[229246]: 2025-12-06 10:11:33.399 229250 DEBUG nova.network.neutron [-] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  6 05:11:33 np0005548918 podman[237127]: 2025-12-06 10:11:33.428307182 +0000 UTC m=+0.059312857 container create a66057d9ef5601f42a21320db87f6658480124d63ee9a13051f3e457148ceb57 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_sanderson, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Dec  6 05:11:33 np0005548918 systemd[1]: Started libpod-conmon-a66057d9ef5601f42a21320db87f6658480124d63ee9a13051f3e457148ceb57.scope.
Dec  6 05:11:33 np0005548918 podman[237127]: 2025-12-06 10:11:33.404716863 +0000 UTC m=+0.035722558 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 05:11:33 np0005548918 systemd[1]: Started libcrun container.
Dec  6 05:11:33 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9608ee93c04650902f73e1827a4430d57d1b5b5f0669be3d2e9f55e9f738294/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  6 05:11:33 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9608ee93c04650902f73e1827a4430d57d1b5b5f0669be3d2e9f55e9f738294/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  6 05:11:33 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9608ee93c04650902f73e1827a4430d57d1b5b5f0669be3d2e9f55e9f738294/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 05:11:33 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9608ee93c04650902f73e1827a4430d57d1b5b5f0669be3d2e9f55e9f738294/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  6 05:11:33 np0005548918 podman[237127]: 2025-12-06 10:11:33.534690312 +0000 UTC m=+0.165696007 container init a66057d9ef5601f42a21320db87f6658480124d63ee9a13051f3e457148ceb57 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_sanderson, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  6 05:11:33 np0005548918 podman[237127]: 2025-12-06 10:11:33.54387166 +0000 UTC m=+0.174877325 container start a66057d9ef5601f42a21320db87f6658480124d63ee9a13051f3e457148ceb57 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_sanderson, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Dec  6 05:11:33 np0005548918 podman[237127]: 2025-12-06 10:11:33.546992504 +0000 UTC m=+0.177998189 container attach a66057d9ef5601f42a21320db87f6658480124d63ee9a13051f3e457148ceb57 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_sanderson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 05:11:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:33 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:33 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001ac0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]: [
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:    {
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:        "available": false,
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:        "being_replaced": false,
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:        "ceph_device_lvm": false,
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:        "device_id": "QEMU_DVD-ROM_QM00001",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:        "lsm_data": {},
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:        "lvs": [],
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:        "path": "/dev/sr0",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:        "rejected_reasons": [
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "Insufficient space (<5GB)",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "Has a FileSystem"
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:        ],
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:        "sys_api": {
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "actuators": null,
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "device_nodes": [
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:                "sr0"
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            ],
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "devname": "sr0",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "human_readable_size": "482.00 KB",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "id_bus": "ata",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "model": "QEMU DVD-ROM",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "nr_requests": "2",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "parent": "/dev/sr0",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "partitions": {},
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "path": "/dev/sr0",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "removable": "1",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "rev": "2.5+",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "ro": "0",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "rotational": "1",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "sas_address": "",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "sas_device_handle": "",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "scheduler_mode": "mq-deadline",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "sectors": 0,
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "sectorsize": "2048",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "size": 493568.0,
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "support_discard": "2048",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "type": "disk",
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:            "vendor": "QEMU"
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:        }
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]:    }
Dec  6 05:11:34 np0005548918 sleepy_sanderson[237144]: ]
Dec  6 05:11:34 np0005548918 systemd[1]: libpod-a66057d9ef5601f42a21320db87f6658480124d63ee9a13051f3e457148ceb57.scope: Deactivated successfully.
Dec  6 05:11:34 np0005548918 podman[237127]: 2025-12-06 10:11:34.349802158 +0000 UTC m=+0.980807813 container died a66057d9ef5601f42a21320db87f6658480124d63ee9a13051f3e457148ceb57 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_sanderson, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default)
Dec  6 05:11:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:34 np0005548918 systemd[1]: var-lib-containers-storage-overlay-d9608ee93c04650902f73e1827a4430d57d1b5b5f0669be3d2e9f55e9f738294-merged.mount: Deactivated successfully.
Dec  6 05:11:34 np0005548918 podman[237127]: 2025-12-06 10:11:34.391632301 +0000 UTC m=+1.022637966 container remove a66057d9ef5601f42a21320db87f6658480124d63ee9a13051f3e457148ceb57 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_sanderson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec  6 05:11:34 np0005548918 systemd[1]: libpod-conmon-a66057d9ef5601f42a21320db87f6658480124d63ee9a13051f3e457148ceb57.scope: Deactivated successfully.
Dec  6 05:11:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:34.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:11:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:34.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:11:34 np0005548918 nova_compute[229246]: 2025-12-06 10:11:34.895 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:34 np0005548918 nova_compute[229246]: 2025-12-06 10:11:34.910 229250 DEBUG nova.network.neutron [-] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:11:34 np0005548918 nova_compute[229246]: 2025-12-06 10:11:34.929 229250 INFO nova.compute.manager [-] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Took 1.53 seconds to deallocate network for instance.#033[00m
Dec  6 05:11:34 np0005548918 nova_compute[229246]: 2025-12-06 10:11:34.939 229250 DEBUG nova.compute.manager [req-a2e4d8b2-6bd9-4abf-b5d7-c1ff9f3485b3 req-451758c8-b855-4928-bf3b-f285f695bea3 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Received event network-vif-plugged-39cc9025-b8be-492b-82cf-765dd17ecbc2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:11:34 np0005548918 nova_compute[229246]: 2025-12-06 10:11:34.939 229250 DEBUG oslo_concurrency.lockutils [req-a2e4d8b2-6bd9-4abf-b5d7-c1ff9f3485b3 req-451758c8-b855-4928-bf3b-f285f695bea3 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "000b9af7-febc-46b4-801c-39d129655fbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:34 np0005548918 nova_compute[229246]: 2025-12-06 10:11:34.939 229250 DEBUG oslo_concurrency.lockutils [req-a2e4d8b2-6bd9-4abf-b5d7-c1ff9f3485b3 req-451758c8-b855-4928-bf3b-f285f695bea3 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:34 np0005548918 nova_compute[229246]: 2025-12-06 10:11:34.940 229250 DEBUG oslo_concurrency.lockutils [req-a2e4d8b2-6bd9-4abf-b5d7-c1ff9f3485b3 req-451758c8-b855-4928-bf3b-f285f695bea3 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:34 np0005548918 nova_compute[229246]: 2025-12-06 10:11:34.940 229250 DEBUG nova.compute.manager [req-a2e4d8b2-6bd9-4abf-b5d7-c1ff9f3485b3 req-451758c8-b855-4928-bf3b-f285f695bea3 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] No waiting events found dispatching network-vif-plugged-39cc9025-b8be-492b-82cf-765dd17ecbc2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:11:34 np0005548918 nova_compute[229246]: 2025-12-06 10:11:34.940 229250 WARNING nova.compute.manager [req-a2e4d8b2-6bd9-4abf-b5d7-c1ff9f3485b3 req-451758c8-b855-4928-bf3b-f285f695bea3 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Received unexpected event network-vif-plugged-39cc9025-b8be-492b-82cf-765dd17ecbc2 for instance with vm_state active and task_state deleting.#033[00m
Dec  6 05:11:34 np0005548918 nova_compute[229246]: 2025-12-06 10:11:34.976 229250 DEBUG oslo_concurrency.lockutils [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:34 np0005548918 nova_compute[229246]: 2025-12-06 10:11:34.977 229250 DEBUG oslo_concurrency.lockutils [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:35 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:35 np0005548918 nova_compute[229246]: 2025-12-06 10:11:35.051 229250 DEBUG oslo_concurrency.processutils [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:11:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:11:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:11:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:11:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:11:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:11:35 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:11:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:11:35 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3176469227' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:11:35 np0005548918 nova_compute[229246]: 2025-12-06 10:11:35.507 229250 DEBUG oslo_concurrency.processutils [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:11:35 np0005548918 nova_compute[229246]: 2025-12-06 10:11:35.514 229250 DEBUG nova.compute.provider_tree [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:11:35 np0005548918 nova_compute[229246]: 2025-12-06 10:11:35.529 229250 DEBUG nova.scheduler.client.report [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:11:35 np0005548918 nova_compute[229246]: 2025-12-06 10:11:35.549 229250 DEBUG oslo_concurrency.lockutils [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:35 np0005548918 nova_compute[229246]: 2025-12-06 10:11:35.587 229250 INFO nova.scheduler.client.report [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Deleted allocations for instance 000b9af7-febc-46b4-801c-39d129655fbe#033[00m
Dec  6 05:11:35 np0005548918 nova_compute[229246]: 2025-12-06 10:11:35.660 229250 DEBUG oslo_concurrency.lockutils [None req-03afb330-3b60-4fb7-ab84-1e3aaf6a607c 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "000b9af7-febc-46b4-801c-39d129655fbe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:35 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:35 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:36.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:36.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:36 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:36.856 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:11:36 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:36.857 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:11:36 np0005548918 nova_compute[229246]: 2025-12-06 10:11:36.857 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:37 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:37 np0005548918 nova_compute[229246]: 2025-12-06 10:11:37.187 229250 DEBUG nova.compute.manager [req-995f29bd-03d9-4e80-a5a4-7afb597572ff req-031f34f2-de2b-4841-ac76-e1138569eea1 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Received event network-vif-deleted-39cc9025-b8be-492b-82cf-765dd17ecbc2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:11:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:37 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:37 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354002600 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:37 np0005548918 nova_compute[229246]: 2025-12-06 10:11:37.835 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:38.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:38.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:39 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:39 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:11:39 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:11:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:39 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:39 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:39 np0005548918 nova_compute[229246]: 2025-12-06 10:11:39.897 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:40.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:11:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:40.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:11:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:41 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:41 np0005548918 nova_compute[229246]: 2025-12-06 10:11:41.576 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:11:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:41 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001ac0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:41 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:41 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:41.859 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:11:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:42 np0005548918 podman[238564]: 2025-12-06 10:11:42.240174588 +0000 UTC m=+0.127472852 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  6 05:11:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:42 np0005548918 nova_compute[229246]: 2025-12-06 10:11:42.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:11:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:11:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:42.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:11:42 np0005548918 nova_compute[229246]: 2025-12-06 10:11:42.837 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:42.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:42 np0005548918 nova_compute[229246]: 2025-12-06 10:11:42.971 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:43 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:43 np0005548918 nova_compute[229246]: 2025-12-06 10:11:43.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:11:43 np0005548918 nova_compute[229246]: 2025-12-06 10:11:43.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:11:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:43 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001ac0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:43 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:44.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:11:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:44.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:11:44 np0005548918 nova_compute[229246]: 2025-12-06 10:11:44.899 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:45 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:45 np0005548918 nova_compute[229246]: 2025-12-06 10:11:45.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:11:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:45 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001ac0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:45 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354002ae0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:46.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:11:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:46.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:11:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:47 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:47 np0005548918 nova_compute[229246]: 2025-12-06 10:11:47.530 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:11:47 np0005548918 nova_compute[229246]: 2025-12-06 10:11:47.534 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:11:47 np0005548918 nova_compute[229246]: 2025-12-06 10:11:47.534 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:11:47 np0005548918 nova_compute[229246]: 2025-12-06 10:11:47.535 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:11:47 np0005548918 nova_compute[229246]: 2025-12-06 10:11:47.549 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:11:47 np0005548918 nova_compute[229246]: 2025-12-06 10:11:47.549 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:11:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:47 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:47 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:47 np0005548918 nova_compute[229246]: 2025-12-06 10:11:47.805 229250 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765015892.802533, 000b9af7-febc-46b4-801c-39d129655fbe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:11:47 np0005548918 nova_compute[229246]: 2025-12-06 10:11:47.806 229250 INFO nova.compute.manager [-] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] VM Stopped (Lifecycle Event)#033[00m
Dec  6 05:11:47 np0005548918 nova_compute[229246]: 2025-12-06 10:11:47.839 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:47 np0005548918 nova_compute[229246]: 2025-12-06 10:11:47.878 229250 DEBUG nova.compute.manager [None req-0a117eb6-937d-4e32-9624-11d638c9f3f9 - - - - - -] [instance: 000b9af7-febc-46b4-801c-39d129655fbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:11:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:48 np0005548918 nova_compute[229246]: 2025-12-06 10:11:48.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:11:48 np0005548918 nova_compute[229246]: 2025-12-06 10:11:48.568 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:48 np0005548918 nova_compute[229246]: 2025-12-06 10:11:48.569 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:48 np0005548918 nova_compute[229246]: 2025-12-06 10:11:48.569 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:48 np0005548918 nova_compute[229246]: 2025-12-06 10:11:48.569 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:11:48 np0005548918 nova_compute[229246]: 2025-12-06 10:11:48.569 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:11:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:11:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:48.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:11:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:48.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:11:48 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/468755846' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.010 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:11:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:49 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354002ae0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.216 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.217 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4832MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.217 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.218 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.301 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.301 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.323 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing inventories for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.371 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating ProviderTree inventory for provider 31f5f484-bf36-44de-83b8-7b434061a77b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.372 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating inventory in ProviderTree for provider 31f5f484-bf36-44de-83b8-7b434061a77b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.402 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing aggregate associations for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.439 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing trait associations for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE4A,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_ABM,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,HW_CPU_X86_AVX2,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.469 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:11:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:49 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001ae0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:49 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:49 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:11:49 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1105957875' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.901 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.909 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.913 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.931 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.957 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:11:49 np0005548918 nova_compute[229246]: 2025-12-06 10:11:49.957 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.740s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:50 np0005548918 podman[238646]: 2025-12-06 10:11:50.179511663 +0000 UTC m=+0.068469555 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:11:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:11:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:50.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:11:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:11:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:50.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:11:50 np0005548918 nova_compute[229246]: 2025-12-06 10:11:50.958 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:11:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:51 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:51 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354002ae0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:51 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001ae0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:11:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:52.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:11:52 np0005548918 nova_compute[229246]: 2025-12-06 10:11:52.840 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:52.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:53 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:53.680 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:11:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:53.681 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:11:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:11:53.682 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:11:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:53 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:53 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354002ae0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:11:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:54.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:11:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:54.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:54 np0005548918 nova_compute[229246]: 2025-12-06 10:11:54.903 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:55 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001ae0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:11:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:55 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:55 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354002ae0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:56 np0005548918 podman[238696]: 2025-12-06 10:11:56.174486658 +0000 UTC m=+0.062320509 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Dec  6 05:11:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:56.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:56.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc368003140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001ae0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:57 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:57 np0005548918 nova_compute[229246]: 2025-12-06 10:11:57.842 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:58.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:11:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:11:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:58.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:11:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:59 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354002ae0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:11:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:11:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:59 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc354002ae0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:11:59 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc38c001ae0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:11:59 np0005548918 nova_compute[229246]: 2025-12-06 10:11:59.905 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:11:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:11:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Dec  6 05:12:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Dec  6 05:12:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Dec  6 05:12:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Dec  6 05:12:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Dec  6 05:12:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Dec  6 05:12:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Dec  6 05:12:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Dec  6 05:12:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:12:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:00.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:12:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:00.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:12:01 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc380004b20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[233769]: 06/12/2025 10:12:01 : epoch 69340044 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc36800c280 fd 48 proxy ignored for local
Dec  6 05:12:01 np0005548918 kernel: ganesha.nfsd[236148]: segfault at 50 ip 00007fc43eb6932e sp 00007fc403ffe210 error 4 in libntirpc.so.5.8[7fc43eb4e000+2c000] likely on CPU 4 (core 0, socket 4)
Dec  6 05:12:01 np0005548918 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 05:12:01 np0005548918 systemd[1]: Started Process Core Dump (PID 238722/UID 0).
Dec  6 05:12:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:02 np0005548918 systemd-coredump[238723]: Process 233773 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 72:#012#0  0x00007fc43eb6932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 05:12:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:02.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:02 np0005548918 nova_compute[229246]: 2025-12-06 10:12:02.845 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:02 np0005548918 systemd[1]: systemd-coredump@7-238722-0.service: Deactivated successfully.
Dec  6 05:12:02 np0005548918 systemd[1]: systemd-coredump@7-238722-0.service: Consumed 1.043s CPU time.
Dec  6 05:12:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:02.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:02 np0005548918 podman[238729]: 2025-12-06 10:12:02.926502988 +0000 UTC m=+0.030098456 container died ac64860b815acf5b7beef1497de59798554a71947748d7def201dca58b1b4a6a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  6 05:12:02 np0005548918 systemd[1]: var-lib-containers-storage-overlay-d4289d5edb023c7dbe91d5e418954f3f6566f52155e4cf4fe497941c8d806b00-merged.mount: Deactivated successfully.
Dec  6 05:12:02 np0005548918 podman[238729]: 2025-12-06 10:12:02.983542962 +0000 UTC m=+0.087138420 container remove ac64860b815acf5b7beef1497de59798554a71947748d7def201dca58b1b4a6a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 05:12:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:02 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Main process exited, code=exited, status=139/n/a
Dec  6 05:12:03 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Failed with result 'exit-code'.
Dec  6 05:12:03 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.829s CPU time.
Dec  6 05:12:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:04.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:04.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:04 np0005548918 nova_compute[229246]: 2025-12-06 10:12:04.907 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:06.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:06.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/101207 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:12:07 np0005548918 nova_compute[229246]: 2025-12-06 10:12:07.847 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:08.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:08.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:09 np0005548918 nova_compute[229246]: 2025-12-06 10:12:09.942 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:10.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:10.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:12.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:12 np0005548918 nova_compute[229246]: 2025-12-06 10:12:12.849 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:12.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:13 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Scheduled restart job, restart counter is at 8.
Dec  6 05:12:13 np0005548918 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:12:13 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.829s CPU time.
Dec  6 05:12:13 np0005548918 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 05:12:13 np0005548918 podman[238782]: 2025-12-06 10:12:13.257753256 +0000 UTC m=+0.118364245 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec  6 05:12:13 np0005548918 podman[238878]: 2025-12-06 10:12:13.466144467 +0000 UTC m=+0.043902369 container create f5915d1f61f0eefc7354eaa2f368c1cf2cbfabbe32685522cce767b40ee657ad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid)
Dec  6 05:12:13 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81e0c7b5904a7a7a5613ddac362593e6279c221962a52de33a4e5021d555154c/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 05:12:13 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81e0c7b5904a7a7a5613ddac362593e6279c221962a52de33a4e5021d555154c/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:12:13 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81e0c7b5904a7a7a5613ddac362593e6279c221962a52de33a4e5021d555154c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 05:12:13 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81e0c7b5904a7a7a5613ddac362593e6279c221962a52de33a4e5021d555154c/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.sseuqb-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:12:13 np0005548918 podman[238878]: 2025-12-06 10:12:13.534926229 +0000 UTC m=+0.112684161 container init f5915d1f61f0eefc7354eaa2f368c1cf2cbfabbe32685522cce767b40ee657ad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 05:12:13 np0005548918 podman[238878]: 2025-12-06 10:12:13.448397617 +0000 UTC m=+0.026155539 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 05:12:13 np0005548918 podman[238878]: 2025-12-06 10:12:13.547644833 +0000 UTC m=+0.125402755 container start f5915d1f61f0eefc7354eaa2f368c1cf2cbfabbe32685522cce767b40ee657ad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Dec  6 05:12:13 np0005548918 bash[238878]: f5915d1f61f0eefc7354eaa2f368c1cf2cbfabbe32685522cce767b40ee657ad
Dec  6 05:12:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:13 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 05:12:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:13 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 05:12:13 np0005548918 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:12:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:13 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 05:12:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:13 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 05:12:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:13 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 05:12:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:13 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 05:12:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:13 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 05:12:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:13 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:12:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:14.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:14.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:14 np0005548918 nova_compute[229246]: 2025-12-06 10:12:14.991 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:16.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:16.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:17 np0005548918 nova_compute[229246]: 2025-12-06 10:12:17.851 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:18.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:18.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:19 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:12:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:19 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:12:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:19 np0005548918 nova_compute[229246]: 2025-12-06 10:12:19.994 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:20.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:20.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:21 np0005548918 podman[238943]: 2025-12-06 10:12:21.158876055 +0000 UTC m=+0.048919875 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Dec  6 05:12:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:22.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:22 np0005548918 nova_compute[229246]: 2025-12-06 10:12:22.853 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:22.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:24.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:24.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:24 np0005548918 nova_compute[229246]: 2025-12-06 10:12:24.995 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3478000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f34680016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:26.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:26.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:27 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f34680016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:27 np0005548918 podman[238984]: 2025-12-06 10:12:27.176214886 +0000 UTC m=+0.064263440 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec  6 05:12:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/101227 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:12:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:27 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3450000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:27 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:27 np0005548918 nova_compute[229246]: 2025-12-06 10:12:27.854 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:28 np0005548918 ovn_controller[132371]: 2025-12-06T10:12:28Z|00062|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec  6 05:12:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:28.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:28.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:29 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:29 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f34680016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:29 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f34500016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:30 np0005548918 nova_compute[229246]: 2025-12-06 10:12:30.042 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:30.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:12:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:30.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:12:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:31 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:31 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:31 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f34680016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:32 np0005548918 nova_compute[229246]: 2025-12-06 10:12:32.856 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:32.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:32.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:33 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f34680016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:33 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:33 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:34.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:34.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:35 np0005548918 nova_compute[229246]: 2025-12-06 10:12:35.044 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:35 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3450001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:35 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:35 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:36.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:36.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:37 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:37 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3450001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:37 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:37 np0005548918 nova_compute[229246]: 2025-12-06 10:12:37.858 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:38.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:38.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:39 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:39 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:39 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3450001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:40 np0005548918 nova_compute[229246]: 2025-12-06 10:12:40.046 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:40.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:40.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:41 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:12:41 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:12:41 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:12:41 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:12:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:41 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:41 np0005548918 nova_compute[229246]: 2025-12-06 10:12:41.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:12:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:41 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:41 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:42 np0005548918 nova_compute[229246]: 2025-12-06 10:12:42.860 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:42.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:42.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:43 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f34500032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:43 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f34500032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:43 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:44 np0005548918 podman[239128]: 2025-12-06 10:12:44.252362524 +0000 UTC m=+0.122219430 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  6 05:12:44 np0005548918 nova_compute[229246]: 2025-12-06 10:12:44.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:12:44 np0005548918 nova_compute[229246]: 2025-12-06 10:12:44.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:12:44 np0005548918 nova_compute[229246]: 2025-12-06 10:12:44.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:12:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:44.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:44 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:12:44.906 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:12:44 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:12:44.906 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:12:44 np0005548918 nova_compute[229246]: 2025-12-06 10:12:44.907 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:44.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:45 np0005548918 nova_compute[229246]: 2025-12-06 10:12:45.048 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:45 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:45 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:45 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:46 np0005548918 nova_compute[229246]: 2025-12-06 10:12:46.532 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:12:46 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:12:46 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:12:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:46.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:46.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:47 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:47 np0005548918 nova_compute[229246]: 2025-12-06 10:12:47.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:12:47 np0005548918 nova_compute[229246]: 2025-12-06 10:12:47.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:12:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:47 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:47 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:47 np0005548918 nova_compute[229246]: 2025-12-06 10:12:47.863 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:48.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:48.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:49 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:49 np0005548918 nova_compute[229246]: 2025-12-06 10:12:49.532 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:12:49 np0005548918 nova_compute[229246]: 2025-12-06 10:12:49.534 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:12:49 np0005548918 nova_compute[229246]: 2025-12-06 10:12:49.534 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:12:49 np0005548918 nova_compute[229246]: 2025-12-06 10:12:49.535 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:12:49 np0005548918 nova_compute[229246]: 2025-12-06 10:12:49.549 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:12:49 np0005548918 nova_compute[229246]: 2025-12-06 10:12:49.549 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:12:49 np0005548918 nova_compute[229246]: 2025-12-06 10:12:49.549 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:12:49 np0005548918 nova_compute[229246]: 2025-12-06 10:12:49.573 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:12:49 np0005548918 nova_compute[229246]: 2025-12-06 10:12:49.573 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:12:49 np0005548918 nova_compute[229246]: 2025-12-06 10:12:49.574 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:12:49 np0005548918 nova_compute[229246]: 2025-12-06 10:12:49.574 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:12:49 np0005548918 nova_compute[229246]: 2025-12-06 10:12:49.575 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:12:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:49 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:49 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.830173) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969830265, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1421, "num_deletes": 255, "total_data_size": 3384085, "memory_usage": 3439976, "flush_reason": "Manual Compaction"}
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969850036, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2208581, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28448, "largest_seqno": 29864, "table_properties": {"data_size": 2202622, "index_size": 3222, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12816, "raw_average_key_size": 19, "raw_value_size": 2190437, "raw_average_value_size": 3308, "num_data_blocks": 142, "num_entries": 662, "num_filter_entries": 662, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015855, "oldest_key_time": 1765015855, "file_creation_time": 1765015969, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 19962 microseconds, and 5454 cpu microseconds.
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.850117) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2208581 bytes OK
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.850156) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.853768) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.853786) EVENT_LOG_v1 {"time_micros": 1765015969853780, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.853811) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3377363, prev total WAL file size 3377363, number of live WAL files 2.
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.854865) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353033' seq:72057594037927935, type:22 .. '6C6F676D00373534' seq:0, type:0; will stop at (end)
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2156KB)], [54(14MB)]
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969854944, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 17656706, "oldest_snapshot_seqno": -1}
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 6029 keys, 17524941 bytes, temperature: kUnknown
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969953038, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 17524941, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17481041, "index_size": 27726, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15109, "raw_key_size": 153735, "raw_average_key_size": 25, "raw_value_size": 17368476, "raw_average_value_size": 2880, "num_data_blocks": 1135, "num_entries": 6029, "num_filter_entries": 6029, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765015969, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.953273) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 17524941 bytes
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.993869) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 179.9 rd, 178.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 14.7 +0.0 blob) out(16.7 +0.0 blob), read-write-amplify(15.9) write-amplify(7.9) OK, records in: 6555, records dropped: 526 output_compression: NoCompression
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.993902) EVENT_LOG_v1 {"time_micros": 1765015969993891, "job": 32, "event": "compaction_finished", "compaction_time_micros": 98153, "compaction_time_cpu_micros": 31951, "output_level": 6, "num_output_files": 1, "total_output_size": 17524941, "num_input_records": 6555, "num_output_records": 6029, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969994471, "job": 32, "event": "table_file_deletion", "file_number": 56}
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969996740, "job": 32, "event": "table_file_deletion", "file_number": 54}
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.854718) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.996792) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.996797) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.996798) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.996799) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:12:49 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:12:49.996800) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:12:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:12:50 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4068013306' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:12:50 np0005548918 nova_compute[229246]: 2025-12-06 10:12:50.050 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:50 np0005548918 nova_compute[229246]: 2025-12-06 10:12:50.056 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:12:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:50 np0005548918 nova_compute[229246]: 2025-12-06 10:12:50.248 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:12:50 np0005548918 nova_compute[229246]: 2025-12-06 10:12:50.249 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4838MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:12:50 np0005548918 nova_compute[229246]: 2025-12-06 10:12:50.249 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:12:50 np0005548918 nova_compute[229246]: 2025-12-06 10:12:50.250 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:12:50 np0005548918 nova_compute[229246]: 2025-12-06 10:12:50.524 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:12:50 np0005548918 nova_compute[229246]: 2025-12-06 10:12:50.525 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:12:50 np0005548918 nova_compute[229246]: 2025-12-06 10:12:50.544 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:12:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:50.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:50.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:51 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:12:51 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/219018723' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:12:51 np0005548918 nova_compute[229246]: 2025-12-06 10:12:51.044 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:12:51 np0005548918 nova_compute[229246]: 2025-12-06 10:12:51.051 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:12:51 np0005548918 nova_compute[229246]: 2025-12-06 10:12:51.075 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:12:51 np0005548918 nova_compute[229246]: 2025-12-06 10:12:51.076 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:12:51 np0005548918 nova_compute[229246]: 2025-12-06 10:12:51.077 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:12:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:51 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:51 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:51 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:51 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:12:51.908 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:12:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:52 np0005548918 podman[239233]: 2025-12-06 10:12:52.176424705 +0000 UTC m=+0.067010365 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  6 05:12:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:52 np0005548918 nova_compute[229246]: 2025-12-06 10:12:52.865 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:52.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:52.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:53 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:12:53.680 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:12:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:12:53.680 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:12:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:12:53.681 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:12:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:53 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:53 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:54.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:54.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:55 np0005548918 nova_compute[229246]: 2025-12-06 10:12:55.053 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:12:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:55 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:55 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f34500032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:55 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:56.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:12:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:56.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:12:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:57 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:57 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:57 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3444000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:57 np0005548918 nova_compute[229246]: 2025-12-06 10:12:57.867 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:12:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:58 np0005548918 podman[239285]: 2025-12-06 10:12:58.170366262 +0000 UTC m=+0.057390834 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd)
Dec  6 05:12:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:12:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:58.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:12:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:12:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:12:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:58.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:12:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:12:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:59 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:12:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:12:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:59 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:12:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:12:59 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:00 np0005548918 nova_compute[229246]: 2025-12-06 10:13:00.055 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:00.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:00.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:01 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f34440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:01 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:01 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:02 np0005548918 nova_compute[229246]: 2025-12-06 10:13:02.869 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.003000081s ======
Dec  6 05:13:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:02.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000081s
Dec  6 05:13:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:13:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:02.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:13:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:03 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:03 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f34440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:03 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:04.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:04.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:05 np0005548918 nova_compute[229246]: 2025-12-06 10:13:05.057 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:05 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:05 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:05 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f34440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:06 np0005548918 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  6 05:13:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:06.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:06.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:07 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:07 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:07 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:07 np0005548918 nova_compute[229246]: 2025-12-06 10:13:07.870 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:08.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:08.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:09 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3444002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:09 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:09 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:10 np0005548918 nova_compute[229246]: 2025-12-06 10:13:10.058 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:10.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:13:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:10.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:13:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:11 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:11 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3444002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:11 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:12 np0005548918 nova_compute[229246]: 2025-12-06 10:13:12.872 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:12.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:12.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:13 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:13 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:13 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3444002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:14.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:14.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:15 np0005548918 nova_compute[229246]: 2025-12-06 10:13:15.059 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:15 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:15 np0005548918 podman[239351]: 2025-12-06 10:13:15.211233952 +0000 UTC m=+0.098544668 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  6 05:13:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:15 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:15 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:13:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:16.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:13:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:16.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:17 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:17 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:17 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:17 np0005548918 nova_compute[229246]: 2025-12-06 10:13:17.875 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:18.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:18.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:19 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:19 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:19 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:20 np0005548918 nova_compute[229246]: 2025-12-06 10:13:20.061 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:20.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:20.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:21 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:21 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:21 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:22 np0005548918 nova_compute[229246]: 2025-12-06 10:13:22.878 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:22.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:22.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:23 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:23 np0005548918 podman[239385]: 2025-12-06 10:13:23.17108306 +0000 UTC m=+0.058429594 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  6 05:13:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:23 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:23 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:24.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:24.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:25 np0005548918 nova_compute[229246]: 2025-12-06 10:13:25.064 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3468002fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:25 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f345c003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:13:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:26.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:13:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:26.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:27 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3444003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:27 np0005548918 kernel: ganesha.nfsd[238981]: segfault at 50 ip 00007f352816132e sp 00007f34e3ffe210 error 4 in libntirpc.so.5.8[7f3528146000+2c000] likely on CPU 1 (core 0, socket 1)
Dec  6 05:13:27 np0005548918 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 05:13:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[238893]: 06/12/2025 10:13:27 : epoch 6934017d : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3454003db0 fd 38 proxy ignored for local
Dec  6 05:13:27 np0005548918 systemd[1]: Started Process Core Dump (PID 239412/UID 0).
Dec  6 05:13:27 np0005548918 nova_compute[229246]: 2025-12-06 10:13:27.880 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:28 np0005548918 systemd-coredump[239413]: Process 238898 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 53:#012#0  0x00007f352816132e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 05:13:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:13:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:28.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:13:28 np0005548918 systemd[1]: systemd-coredump@8-239412-0.service: Deactivated successfully.
Dec  6 05:13:28 np0005548918 systemd[1]: systemd-coredump@8-239412-0.service: Consumed 1.092s CPU time.
Dec  6 05:13:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:13:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:28.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:13:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:29 np0005548918 podman[239420]: 2025-12-06 10:13:29.02417185 +0000 UTC m=+0.026935342 container died f5915d1f61f0eefc7354eaa2f368c1cf2cbfabbe32685522cce767b40ee657ad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  6 05:13:29 np0005548918 systemd[1]: var-lib-containers-storage-overlay-81e0c7b5904a7a7a5613ddac362593e6279c221962a52de33a4e5021d555154c-merged.mount: Deactivated successfully.
Dec  6 05:13:29 np0005548918 podman[239420]: 2025-12-06 10:13:29.068837725 +0000 UTC m=+0.071601197 container remove f5915d1f61f0eefc7354eaa2f368c1cf2cbfabbe32685522cce767b40ee657ad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  6 05:13:29 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Main process exited, code=exited, status=139/n/a
Dec  6 05:13:29 np0005548918 podman[239419]: 2025-12-06 10:13:29.079974943 +0000 UTC m=+0.088425737 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:13:29 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Failed with result 'exit-code'.
Dec  6 05:13:29 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.529s CPU time.
Dec  6 05:13:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/101329 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:13:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:30 np0005548918 nova_compute[229246]: 2025-12-06 10:13:30.066 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:30.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:30.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:32 np0005548918 nova_compute[229246]: 2025-12-06 10:13:32.881 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:32.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:32.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/101333 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:13:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:34.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:13:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:34.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:13:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:35 np0005548918 nova_compute[229246]: 2025-12-06 10:13:35.067 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:36.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:36.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:37 np0005548918 nova_compute[229246]: 2025-12-06 10:13:37.882 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:38.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:13:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:38.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:13:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:39 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Scheduled restart job, restart counter is at 9.
Dec  6 05:13:39 np0005548918 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:13:39 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.529s CPU time.
Dec  6 05:13:39 np0005548918 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec  6 05:13:39 np0005548918 podman[239567]: 2025-12-06 10:13:39.576236602 +0000 UTC m=+0.056808452 container create e0ec4d7cccdd0c825989c42c82ae89eddf385a57b9308109ee802c61948bb4ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  6 05:13:39 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7ca7e97f3abb57b185ebb8059b67b96028c2cdfdce1eef328b75ca6d06a511/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  6 05:13:39 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7ca7e97f3abb57b185ebb8059b67b96028c2cdfdce1eef328b75ca6d06a511/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:13:39 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7ca7e97f3abb57b185ebb8059b67b96028c2cdfdce1eef328b75ca6d06a511/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  6 05:13:39 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7ca7e97f3abb57b185ebb8059b67b96028c2cdfdce1eef328b75ca6d06a511/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.sseuqb-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  6 05:13:39 np0005548918 podman[239567]: 2025-12-06 10:13:39.545917821 +0000 UTC m=+0.026489761 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  6 05:13:39 np0005548918 podman[239567]: 2025-12-06 10:13:39.645526145 +0000 UTC m=+0.126098005 container init e0ec4d7cccdd0c825989c42c82ae89eddf385a57b9308109ee802c61948bb4ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Dec  6 05:13:39 np0005548918 podman[239567]: 2025-12-06 10:13:39.651124625 +0000 UTC m=+0.131696465 container start e0ec4d7cccdd0c825989c42c82ae89eddf385a57b9308109ee802c61948bb4ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  6 05:13:39 np0005548918 bash[239567]: e0ec4d7cccdd0c825989c42c82ae89eddf385a57b9308109ee802c61948bb4ce
Dec  6 05:13:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  6 05:13:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  6 05:13:39 np0005548918 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:13:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  6 05:13:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  6 05:13:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  6 05:13:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  6 05:13:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  6 05:13:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:13:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:40 np0005548918 nova_compute[229246]: 2025-12-06 10:13:40.069 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:13:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:40.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:13:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:13:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:40.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:13:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:42 np0005548918 nova_compute[229246]: 2025-12-06 10:13:42.885 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:42.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:42.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:44.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:13:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:44.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:13:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:45 np0005548918 nova_compute[229246]: 2025-12-06 10:13:45.062 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:13:45 np0005548918 nova_compute[229246]: 2025-12-06 10:13:45.063 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:13:45 np0005548918 nova_compute[229246]: 2025-12-06 10:13:45.071 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:45 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  6 05:13:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:45 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  6 05:13:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:45 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:13:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:45 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:13:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:45 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:13:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:46 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:13:46.022 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:13:46 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:13:46.024 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:13:46 np0005548918 nova_compute[229246]: 2025-12-06 10:13:46.024 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:46 np0005548918 podman[239654]: 2025-12-06 10:13:46.202000545 +0000 UTC m=+0.154395931 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:13:46 np0005548918 nova_compute[229246]: 2025-12-06 10:13:46.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:13:46 np0005548918 nova_compute[229246]: 2025-12-06 10:13:46.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:13:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:46 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:13:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:46 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:13:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:46 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:13:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:46 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  6 05:13:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:46 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:13:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:46 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:13:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:46 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:13:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:46.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:13:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:46.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:13:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:47 np0005548918 nova_compute[229246]: 2025-12-06 10:13:47.538 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:13:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:47 np0005548918 nova_compute[229246]: 2025-12-06 10:13:47.887 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:13:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:47 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:13:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:48 np0005548918 nova_compute[229246]: 2025-12-06 10:13:48.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:13:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:48.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:13:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:48.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:13:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:49 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:13:49.026 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:13:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/101349 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:13:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:50 np0005548918 nova_compute[229246]: 2025-12-06 10:13:50.073 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:50 np0005548918 nova_compute[229246]: 2025-12-06 10:13:50.530 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:13:50 np0005548918 nova_compute[229246]: 2025-12-06 10:13:50.534 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:13:50 np0005548918 nova_compute[229246]: 2025-12-06 10:13:50.534 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:13:50 np0005548918 nova_compute[229246]: 2025-12-06 10:13:50.557 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:13:50 np0005548918 nova_compute[229246]: 2025-12-06 10:13:50.557 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:13:50 np0005548918 nova_compute[229246]: 2025-12-06 10:13:50.557 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:13:50 np0005548918 nova_compute[229246]: 2025-12-06 10:13:50.558 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:13:50 np0005548918 nova_compute[229246]: 2025-12-06 10:13:50.558 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:13:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:50.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:50.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:51 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:13:51 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2027604282' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:13:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:51 np0005548918 nova_compute[229246]: 2025-12-06 10:13:51.031 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:13:51 np0005548918 nova_compute[229246]: 2025-12-06 10:13:51.169 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:13:51 np0005548918 nova_compute[229246]: 2025-12-06 10:13:51.170 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4852MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:13:51 np0005548918 nova_compute[229246]: 2025-12-06 10:13:51.170 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:13:51 np0005548918 nova_compute[229246]: 2025-12-06 10:13:51.170 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:13:51 np0005548918 nova_compute[229246]: 2025-12-06 10:13:51.238 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:13:51 np0005548918 nova_compute[229246]: 2025-12-06 10:13:51.239 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:13:51 np0005548918 nova_compute[229246]: 2025-12-06 10:13:51.256 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:13:51 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:13:51 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2774251911' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:13:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:51 np0005548918 nova_compute[229246]: 2025-12-06 10:13:51.684 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:13:51 np0005548918 nova_compute[229246]: 2025-12-06 10:13:51.690 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:13:51 np0005548918 nova_compute[229246]: 2025-12-06 10:13:51.709 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:13:51 np0005548918 nova_compute[229246]: 2025-12-06 10:13:51.711 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:13:51 np0005548918 nova_compute[229246]: 2025-12-06 10:13:51.711 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:52 np0005548918 nova_compute[229246]: 2025-12-06 10:13:52.713 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:13:52 np0005548918 nova_compute[229246]: 2025-12-06 10:13:52.713 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:13:52 np0005548918 nova_compute[229246]: 2025-12-06 10:13:52.713 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:13:52 np0005548918 nova_compute[229246]: 2025-12-06 10:13:52.741 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:13:52 np0005548918 nova_compute[229246]: 2025-12-06 10:13:52.889 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:52.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000020:nfs.cephfs.1: -2
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:13:52 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:52 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  6 05:13:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:52 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  6 05:13:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:53.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:53 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:13:53.681 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:13:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:13:53.682 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:13:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:13:53.682 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:13:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:53 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:53 np0005548918 podman[239918]: 2025-12-06 10:13:53.806066794 +0000 UTC m=+0.071461883 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible)
Dec  6 05:13:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:53 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:54.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:55.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:55 np0005548918 nova_compute[229246]: 2025-12-06 10:13:55.076 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:13:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:55 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/101355 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:13:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:55 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:55 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:56.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:13:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:57.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:13:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:57 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:57 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:57 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:57 np0005548918 nova_compute[229246]: 2025-12-06 10:13:57.890 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:13:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:13:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:58.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:13:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:13:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:13:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:59.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:13:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:13:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:59 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:13:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:13:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:59 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:13:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:13:59 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:00 np0005548918 nova_compute[229246]: 2025-12-06 10:14:00.080 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:00 np0005548918 podman[239944]: 2025-12-06 10:14:00.199135661 +0000 UTC m=+0.080971678 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 05:14:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:00.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:01.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:01 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:01 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:01 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:02 np0005548918 nova_compute[229246]: 2025-12-06 10:14:02.891 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:02.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:03.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:03 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:03 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa4001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:03 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:04.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:05.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:05 np0005548918 nova_compute[229246]: 2025-12-06 10:14:05.082 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:05 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:05 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:05 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa4001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:14:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:06.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:14:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:07.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:07 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:07 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:07 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:07 np0005548918 nova_compute[229246]: 2025-12-06 10:14:07.941 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:14:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:08.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:14:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:09.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:09 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa4002160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:09 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:09 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:10 np0005548918 nova_compute[229246]: 2025-12-06 10:14:10.127 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:10.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:11.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:11 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:11 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:11 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:12 np0005548918 nova_compute[229246]: 2025-12-06 10:14:12.942 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:14:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:12.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:14:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:14:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:13.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:14:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:13 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:13 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:13 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:14.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:15.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:15 np0005548918 nova_compute[229246]: 2025-12-06 10:14:15.130 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:15 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:15 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:15 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:14:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:16.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:14:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:17.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:17 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:17 np0005548918 podman[240007]: 2025-12-06 10:14:17.280701485 +0000 UTC m=+0.156858148 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  6 05:14:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:17 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/101417 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:14:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:17 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:17 np0005548918 nova_compute[229246]: 2025-12-06 10:14:17.944 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.566571) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058566595, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1434, "num_deletes": 503, "total_data_size": 2830874, "memory_usage": 2884736, "flush_reason": "Manual Compaction"}
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058580935, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 1843544, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29869, "largest_seqno": 31298, "table_properties": {"data_size": 1837641, "index_size": 2723, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 16377, "raw_average_key_size": 19, "raw_value_size": 1823788, "raw_average_value_size": 2202, "num_data_blocks": 117, "num_entries": 828, "num_filter_entries": 828, "num_deletions": 503, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015970, "oldest_key_time": 1765015970, "file_creation_time": 1765016058, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 14404 microseconds, and 4744 cpu microseconds.
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.580971) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 1843544 bytes OK
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.580990) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.583077) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.583089) EVENT_LOG_v1 {"time_micros": 1765016058583085, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.583103) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 2823129, prev total WAL file size 2823129, number of live WAL files 2.
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.583868) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1800KB)], [57(16MB)]
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058583892, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 19368485, "oldest_snapshot_seqno": -1}
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5832 keys, 13151191 bytes, temperature: kUnknown
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058648846, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 13151191, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13113676, "index_size": 21853, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14597, "raw_key_size": 150769, "raw_average_key_size": 25, "raw_value_size": 13009515, "raw_average_value_size": 2230, "num_data_blocks": 875, "num_entries": 5832, "num_filter_entries": 5832, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765016058, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.649112) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 13151191 bytes
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.668799) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 297.8 rd, 202.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 16.7 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(17.6) write-amplify(7.1) OK, records in: 6857, records dropped: 1025 output_compression: NoCompression
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.668828) EVENT_LOG_v1 {"time_micros": 1765016058668817, "job": 34, "event": "compaction_finished", "compaction_time_micros": 65040, "compaction_time_cpu_micros": 25191, "output_level": 6, "num_output_files": 1, "total_output_size": 13151191, "num_input_records": 6857, "num_output_records": 5832, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058669412, "job": 34, "event": "table_file_deletion", "file_number": 59}
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058673417, "job": 34, "event": "table_file_deletion", "file_number": 57}
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.583792) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.673544) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.673552) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.673555) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.673558) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:14:18 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:14:18.673561) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:14:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:18.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:19.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:19 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:19 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:19 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:20 np0005548918 nova_compute[229246]: 2025-12-06 10:14:20.131 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:14:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:20.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:14:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:21.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.068 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.069 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.100 229250 DEBUG nova.compute.manager [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.173 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.173 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.179 229250 DEBUG nova.virt.hardware [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.180 229250 INFO nova.compute.claims [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec  6 05:14:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:21 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.264 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:14:21 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:14:21 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1807636740' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:14:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.693 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.699 229250 DEBUG nova.compute.provider_tree [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.717 229250 DEBUG nova.scheduler.client.report [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.748 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.749 229250 DEBUG nova.compute.manager [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  6 05:14:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:21 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.820 229250 DEBUG nova.compute.manager [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.821 229250 DEBUG nova.network.neutron [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.843 229250 INFO nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  6 05:14:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:21 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.865 229250 DEBUG nova.compute.manager [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.982 229250 DEBUG nova.compute.manager [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.983 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  6 05:14:21 np0005548918 nova_compute[229246]: 2025-12-06 10:14:21.983 229250 INFO nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Creating image(s)#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.012 229250 DEBUG nova.storage.rbd_utils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:14:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.046 229250 DEBUG nova.storage.rbd_utils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.081 229250 DEBUG nova.storage.rbd_utils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.086 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.164 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.165 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "1b7208203e670301d076a006cb3364d3eb842050" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.166 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "1b7208203e670301d076a006cb3364d3eb842050" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.166 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "1b7208203e670301d076a006cb3364d3eb842050" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.191 229250 DEBUG nova.storage.rbd_utils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.195 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.456 229250 DEBUG nova.policy [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '03615580775245e6ae335ee9d785611f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.541 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/1b7208203e670301d076a006cb3364d3eb842050 8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.346s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.607 229250 DEBUG nova.storage.rbd_utils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] resizing rbd image 8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Dec  6 05:14:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.691 229250 DEBUG nova.objects.instance [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lazy-loading 'migration_context' on Instance uuid 8ec1d789-8c20-4dd3-a44b-5565d1293cea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.706 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.706 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Ensure instance console log exists: /var/lib/nova/instances/8ec1d789-8c20-4dd3-a44b-5565d1293cea/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.707 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.708 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.709 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:22.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:22 np0005548918 nova_compute[229246]: 2025-12-06 10:14:22.997 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:23.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:23 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:23 np0005548918 nova_compute[229246]: 2025-12-06 10:14:23.388 229250 DEBUG nova.network.neutron [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Successfully created port: 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  6 05:14:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:23 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:23 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:24 np0005548918 podman[240230]: 2025-12-06 10:14:24.218106825 +0000 UTC m=+0.095135366 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true)
Dec  6 05:14:24 np0005548918 nova_compute[229246]: 2025-12-06 10:14:24.438 229250 DEBUG nova.network.neutron [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Successfully updated port: 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  6 05:14:24 np0005548918 nova_compute[229246]: 2025-12-06 10:14:24.459 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:14:24 np0005548918 nova_compute[229246]: 2025-12-06 10:14:24.460 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquired lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:14:24 np0005548918 nova_compute[229246]: 2025-12-06 10:14:24.460 229250 DEBUG nova.network.neutron [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  6 05:14:24 np0005548918 nova_compute[229246]: 2025-12-06 10:14:24.524 229250 DEBUG nova.compute.manager [req-39a5a370-1d82-44a6-ae07-bbc9fc2cad7c req-7d461ace-b03b-46cd-b321-6f07ed05f953 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received event network-changed-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:14:24 np0005548918 nova_compute[229246]: 2025-12-06 10:14:24.524 229250 DEBUG nova.compute.manager [req-39a5a370-1d82-44a6-ae07-bbc9fc2cad7c req-7d461ace-b03b-46cd-b321-6f07ed05f953 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Refreshing instance network info cache due to event network-changed-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  6 05:14:24 np0005548918 nova_compute[229246]: 2025-12-06 10:14:24.524 229250 DEBUG oslo_concurrency.lockutils [req-39a5a370-1d82-44a6-ae07-bbc9fc2cad7c req-7d461ace-b03b-46cd-b321-6f07ed05f953 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:14:24 np0005548918 nova_compute[229246]: 2025-12-06 10:14:24.616 229250 DEBUG nova.network.neutron [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  6 05:14:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:24.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:25.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.171 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:25 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.592 229250 DEBUG nova.network.neutron [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Updating instance_info_cache with network_info: [{"id": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "address": "fa:16:3e:ac:c2:fe", "network": {"id": "ef8aaff1-03b0-4544-89c9-035c25f01e5c", "bridge": "br-int", "label": "tempest-network-smoke--1887948682", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap887ea51a-ea", "ovs_interfaceid": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.614 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Releasing lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.614 229250 DEBUG nova.compute.manager [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Instance network_info: |[{"id": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "address": "fa:16:3e:ac:c2:fe", "network": {"id": "ef8aaff1-03b0-4544-89c9-035c25f01e5c", "bridge": "br-int", "label": "tempest-network-smoke--1887948682", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap887ea51a-ea", "ovs_interfaceid": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.615 229250 DEBUG oslo_concurrency.lockutils [req-39a5a370-1d82-44a6-ae07-bbc9fc2cad7c req-7d461ace-b03b-46cd-b321-6f07ed05f953 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquired lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.615 229250 DEBUG nova.network.neutron [req-39a5a370-1d82-44a6-ae07-bbc9fc2cad7c req-7d461ace-b03b-46cd-b321-6f07ed05f953 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Refreshing network info cache for port 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.620 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Start _get_guest_xml network_info=[{"id": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "address": "fa:16:3e:ac:c2:fe", "network": {"id": "ef8aaff1-03b0-4544-89c9-035c25f01e5c", "bridge": "br-int", "label": "tempest-network-smoke--1887948682", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap887ea51a-ea", "ovs_interfaceid": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:04:42Z,direct_url=<?>,disk_format='qcow2',id=9489b8a5-a798-4e26-87f9-59bb1eb2e6fd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3e0ab101ca7547d4a515169a0f2edef3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-06T10:04:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '9489b8a5-a798-4e26-87f9-59bb1eb2e6fd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.625 229250 WARNING nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.630 229250 DEBUG nova.virt.libvirt.host [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.630 229250 DEBUG nova.virt.libvirt.host [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.638 229250 DEBUG nova.virt.libvirt.host [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.639 229250 DEBUG nova.virt.libvirt.host [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.639 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.640 229250 DEBUG nova.virt.hardware [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-06T10:04:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0a252b9c-cc5f-41b2-a8b2-94fcf6e74d22',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:04:42Z,direct_url=<?>,disk_format='qcow2',id=9489b8a5-a798-4e26-87f9-59bb1eb2e6fd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3e0ab101ca7547d4a515169a0f2edef3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-06T10:04:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.640 229250 DEBUG nova.virt.hardware [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.640 229250 DEBUG nova.virt.hardware [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.640 229250 DEBUG nova.virt.hardware [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.641 229250 DEBUG nova.virt.hardware [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.641 229250 DEBUG nova.virt.hardware [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.641 229250 DEBUG nova.virt.hardware [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.641 229250 DEBUG nova.virt.hardware [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.641 229250 DEBUG nova.virt.hardware [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.642 229250 DEBUG nova.virt.hardware [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.642 229250 DEBUG nova.virt.hardware [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  6 05:14:25 np0005548918 nova_compute[229246]: 2025-12-06 10:14:25.645 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:14:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:25 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  6 05:14:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:25 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:25 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:26 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  6 05:14:26 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/955708754' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.076 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.107 229250 DEBUG nova.storage.rbd_utils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.111 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:14:26 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  6 05:14:26 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2572115795' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.549 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.551 229250 DEBUG nova.virt.libvirt.vif [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:14:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2062141463',display_name='tempest-TestNetworkBasicOps-server-2062141463',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2062141463',id=12,image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDiXUgLqzE4HhCWF2FxGckt3FoPTmhlzqmtYr2BE7YoZEjAsKDyP7+VvUeKn/wEuGf5swQ9zHFbCj1Cz5Vthaw3uQeZQm81Uuov1u5HA4+nHd2yE4AljdVzf3CoWLHUdng==',key_name='tempest-TestNetworkBasicOps-875173470',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='92b402c8d3e2476abc98be42a1e6d34e',ramdisk_id='',reservation_id='r-w3v180si',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1971100882',owner_user_name='tempest-TestNetworkBasicOps-1971100882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:14:21Z,user_data=None,user_id='03615580775245e6ae335ee9d785611f',uuid=8ec1d789-8c20-4dd3-a44b-5565d1293cea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "address": "fa:16:3e:ac:c2:fe", "network": {"id": "ef8aaff1-03b0-4544-89c9-035c25f01e5c", "bridge": "br-int", "label": "tempest-network-smoke--1887948682", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap887ea51a-ea", "ovs_interfaceid": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.551 229250 DEBUG nova.network.os_vif_util [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converting VIF {"id": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "address": "fa:16:3e:ac:c2:fe", "network": {"id": "ef8aaff1-03b0-4544-89c9-035c25f01e5c", "bridge": "br-int", "label": "tempest-network-smoke--1887948682", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap887ea51a-ea", "ovs_interfaceid": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.552 229250 DEBUG nova.network.os_vif_util [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:c2:fe,bridge_name='br-int',has_traffic_filtering=True,id=887ea51a-eae0-4aca-80e0-9a3c0d8b60b7,network=Network(ef8aaff1-03b0-4544-89c9-035c25f01e5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap887ea51a-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.553 229250 DEBUG nova.objects.instance [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lazy-loading 'pci_devices' on Instance uuid 8ec1d789-8c20-4dd3-a44b-5565d1293cea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.591 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] End _get_guest_xml xml=<domain type="kvm">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  <uuid>8ec1d789-8c20-4dd3-a44b-5565d1293cea</uuid>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  <name>instance-0000000c</name>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  <memory>131072</memory>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  <vcpu>1</vcpu>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  <metadata>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <nova:name>tempest-TestNetworkBasicOps-server-2062141463</nova:name>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <nova:creationTime>2025-12-06 10:14:25</nova:creationTime>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <nova:flavor name="m1.nano">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <nova:memory>128</nova:memory>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <nova:disk>1</nova:disk>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <nova:swap>0</nova:swap>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <nova:ephemeral>0</nova:ephemeral>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <nova:vcpus>1</nova:vcpus>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      </nova:flavor>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <nova:owner>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <nova:user uuid="03615580775245e6ae335ee9d785611f">tempest-TestNetworkBasicOps-1971100882-project-member</nova:user>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <nova:project uuid="92b402c8d3e2476abc98be42a1e6d34e">tempest-TestNetworkBasicOps-1971100882</nova:project>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      </nova:owner>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <nova:root type="image" uuid="9489b8a5-a798-4e26-87f9-59bb1eb2e6fd"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <nova:ports>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <nova:port uuid="887ea51a-eae0-4aca-80e0-9a3c0d8b60b7">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        </nova:port>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      </nova:ports>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    </nova:instance>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  </metadata>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  <sysinfo type="smbios">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <system>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <entry name="manufacturer">RDO</entry>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <entry name="product">OpenStack Compute</entry>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <entry name="serial">8ec1d789-8c20-4dd3-a44b-5565d1293cea</entry>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <entry name="uuid">8ec1d789-8c20-4dd3-a44b-5565d1293cea</entry>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <entry name="family">Virtual Machine</entry>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    </system>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  </sysinfo>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  <os>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <boot dev="hd"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <smbios mode="sysinfo"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  </os>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  <features>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <acpi/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <apic/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <vmcoreinfo/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  </features>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  <clock offset="utc">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <timer name="pit" tickpolicy="delay"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <timer name="hpet" present="no"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  </clock>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  <cpu mode="host-model" match="exact">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <topology sockets="1" cores="1" threads="1"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  </cpu>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  <devices>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <disk type="network" device="disk">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <driver type="raw" cache="none"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <source protocol="rbd" name="vms/8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <host name="192.168.122.100" port="6789"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <host name="192.168.122.102" port="6789"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <host name="192.168.122.101" port="6789"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      </source>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <auth username="openstack">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <secret type="ceph" uuid="5ecd3f74-dade-5fc4-92ce-8950ae424258"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      </auth>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <target dev="vda" bus="virtio"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    </disk>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <disk type="network" device="cdrom">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <driver type="raw" cache="none"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <source protocol="rbd" name="vms/8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk.config">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <host name="192.168.122.100" port="6789"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <host name="192.168.122.102" port="6789"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <host name="192.168.122.101" port="6789"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      </source>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <auth username="openstack">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:        <secret type="ceph" uuid="5ecd3f74-dade-5fc4-92ce-8950ae424258"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      </auth>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <target dev="sda" bus="sata"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    </disk>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <interface type="ethernet">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <mac address="fa:16:3e:ac:c2:fe"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <model type="virtio"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <driver name="vhost" rx_queue_size="512"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <mtu size="1442"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <target dev="tap887ea51a-ea"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    </interface>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <serial type="pty">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <log file="/var/lib/nova/instances/8ec1d789-8c20-4dd3-a44b-5565d1293cea/console.log" append="off"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    </serial>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <video>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <model type="virtio"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    </video>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <input type="tablet" bus="usb"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <rng model="virtio">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <backend model="random">/dev/urandom</backend>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    </rng>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="pci" model="pcie-root-port"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <controller type="usb" index="0"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    <memballoon model="virtio">
Dec  6 05:14:26 np0005548918 nova_compute[229246]:      <stats period="10"/>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:    </memballoon>
Dec  6 05:14:26 np0005548918 nova_compute[229246]:  </devices>
Dec  6 05:14:26 np0005548918 nova_compute[229246]: </domain>
Dec  6 05:14:26 np0005548918 nova_compute[229246]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.592 229250 DEBUG nova.compute.manager [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Preparing to wait for external event network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.593 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.593 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.593 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.594 229250 DEBUG nova.virt.libvirt.vif [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:14:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2062141463',display_name='tempest-TestNetworkBasicOps-server-2062141463',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2062141463',id=12,image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDiXUgLqzE4HhCWF2FxGckt3FoPTmhlzqmtYr2BE7YoZEjAsKDyP7+VvUeKn/wEuGf5swQ9zHFbCj1Cz5Vthaw3uQeZQm81Uuov1u5HA4+nHd2yE4AljdVzf3CoWLHUdng==',key_name='tempest-TestNetworkBasicOps-875173470',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='92b402c8d3e2476abc98be42a1e6d34e',ramdisk_id='',reservation_id='r-w3v180si',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1971100882',owner_user_name='tempest-TestNetworkBasicOps-1971100882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:14:21Z,user_data=None,user_id='03615580775245e6ae335ee9d785611f',uuid=8ec1d789-8c20-4dd3-a44b-5565d1293cea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "address": "fa:16:3e:ac:c2:fe", "network": {"id": "ef8aaff1-03b0-4544-89c9-035c25f01e5c", "bridge": "br-int", "label": "tempest-network-smoke--1887948682", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap887ea51a-ea", "ovs_interfaceid": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.594 229250 DEBUG nova.network.os_vif_util [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converting VIF {"id": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "address": "fa:16:3e:ac:c2:fe", "network": {"id": "ef8aaff1-03b0-4544-89c9-035c25f01e5c", "bridge": "br-int", "label": "tempest-network-smoke--1887948682", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap887ea51a-ea", "ovs_interfaceid": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.594 229250 DEBUG nova.network.os_vif_util [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:c2:fe,bridge_name='br-int',has_traffic_filtering=True,id=887ea51a-eae0-4aca-80e0-9a3c0d8b60b7,network=Network(ef8aaff1-03b0-4544-89c9-035c25f01e5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap887ea51a-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.595 229250 DEBUG os_vif [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:c2:fe,bridge_name='br-int',has_traffic_filtering=True,id=887ea51a-eae0-4aca-80e0-9a3c0d8b60b7,network=Network(ef8aaff1-03b0-4544-89c9-035c25f01e5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap887ea51a-ea') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.596 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.596 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.596 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.599 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.599 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap887ea51a-ea, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.600 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap887ea51a-ea, col_values=(('external_ids', {'iface-id': '887ea51a-eae0-4aca-80e0-9a3c0d8b60b7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:c2:fe', 'vm-uuid': '8ec1d789-8c20-4dd3-a44b-5565d1293cea'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.601 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:14:26 np0005548918 NetworkManager[48884]: <info>  [1765016066.6021] manager: (tap887ea51a-ea): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.604 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.609 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.610 229250 INFO os_vif [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:c2:fe,bridge_name='br-int',has_traffic_filtering=True,id=887ea51a-eae0-4aca-80e0-9a3c0d8b60b7,network=Network(ef8aaff1-03b0-4544-89c9-035c25f01e5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap887ea51a-ea')
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.661 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.661 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.661 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] No VIF found with MAC fa:16:3e:ac:c2:fe, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.662 229250 INFO nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Using config drive
Dec  6 05:14:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.691 229250 DEBUG nova.storage.rbd_utils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.759 229250 DEBUG nova.network.neutron [req-39a5a370-1d82-44a6-ae07-bbc9fc2cad7c req-7d461ace-b03b-46cd-b321-6f07ed05f953 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Updated VIF entry in instance network info cache for port 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.759 229250 DEBUG nova.network.neutron [req-39a5a370-1d82-44a6-ae07-bbc9fc2cad7c req-7d461ace-b03b-46cd-b321-6f07ed05f953 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Updating instance_info_cache with network_info: [{"id": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "address": "fa:16:3e:ac:c2:fe", "network": {"id": "ef8aaff1-03b0-4544-89c9-035c25f01e5c", "bridge": "br-int", "label": "tempest-network-smoke--1887948682", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap887ea51a-ea", "ovs_interfaceid": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.774 229250 DEBUG oslo_concurrency.lockutils [req-39a5a370-1d82-44a6-ae07-bbc9fc2cad7c req-7d461ace-b03b-46cd-b321-6f07ed05f953 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Releasing lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  6 05:14:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:26.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.982 229250 INFO nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Creating config drive at /var/lib/nova/instances/8ec1d789-8c20-4dd3-a44b-5565d1293cea/disk.config
Dec  6 05:14:26 np0005548918 nova_compute[229246]: 2025-12-06 10:14:26.986 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8ec1d789-8c20-4dd3-a44b-5565d1293cea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpseyqjtne execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  6 05:14:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:27.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.122 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8ec1d789-8c20-4dd3-a44b-5565d1293cea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpseyqjtne" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.155 229250 DEBUG nova.storage.rbd_utils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] rbd image 8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.159 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8ec1d789-8c20-4dd3-a44b-5565d1293cea/disk.config 8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:14:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:27 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.314 229250 DEBUG oslo_concurrency.processutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8ec1d789-8c20-4dd3-a44b-5565d1293cea/disk.config 8ec1d789-8c20-4dd3-a44b-5565d1293cea_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.155s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.315 229250 INFO nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Deleting local config drive /var/lib/nova/instances/8ec1d789-8c20-4dd3-a44b-5565d1293cea/disk.config because it was imported into RBD.#033[00m
Dec  6 05:14:27 np0005548918 systemd[1]: Starting libvirt secret daemon...
Dec  6 05:14:27 np0005548918 systemd[1]: Started libvirt secret daemon.
Dec  6 05:14:27 np0005548918 kernel: tap887ea51a-ea: entered promiscuous mode
Dec  6 05:14:27 np0005548918 NetworkManager[48884]: <info>  [1765016067.4360] manager: (tap887ea51a-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.498 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:27 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:27Z|00063|binding|INFO|Claiming lport 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 for this chassis.
Dec  6 05:14:27 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:27Z|00064|binding|INFO|887ea51a-eae0-4aca-80e0-9a3c0d8b60b7: Claiming fa:16:3e:ac:c2:fe 10.100.0.8
Dec  6 05:14:27 np0005548918 systemd-udevd[240405]: Network interface NamePolicy= disabled on kernel command line.
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.504 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:27 np0005548918 NetworkManager[48884]: <info>  [1765016067.5173] device (tap887ea51a-ea): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  6 05:14:27 np0005548918 NetworkManager[48884]: <info>  [1765016067.5188] device (tap887ea51a-ea): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  6 05:14:27 np0005548918 systemd-machined[192688]: New machine qemu-4-instance-0000000c.
Dec  6 05:14:27 np0005548918 NetworkManager[48884]: <info>  [1765016067.5293] manager: (patch-provnet-c81e973e-7ff9-4cd2-9994-daf87649321f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.527 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:27 np0005548918 NetworkManager[48884]: <info>  [1765016067.5300] manager: (patch-br-int-to-provnet-c81e973e-7ff9-4cd2-9994-daf87649321f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.535 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:c2:fe 10.100.0.8'], port_security=['fa:16:3e:ac:c2:fe 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8ec1d789-8c20-4dd3-a44b-5565d1293cea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef8aaff1-03b0-4544-89c9-035c25f01e5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '62f9558a-9b40-40ff-98c1-c5a43bf7599d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1a37e6e-1014-49d4-9543-ee1567988851, chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], logical_port=887ea51a-eae0-4aca-80e0-9a3c0d8b60b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.536 141640 INFO neutron.agent.ovn.metadata.agent [-] Port 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 in datapath ef8aaff1-03b0-4544-89c9-035c25f01e5c bound to our chassis#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.536 141640 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ef8aaff1-03b0-4544-89c9-035c25f01e5c#033[00m
Dec  6 05:14:27 np0005548918 systemd[1]: Started Virtual Machine qemu-4-instance-0000000c.
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.549 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[5052e709-d95e-4a4e-9cbb-03a2ad969fae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.550 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapef8aaff1-01 in ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.552 233203 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapef8aaff1-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.552 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[e4613e6d-d94d-4b2d-939f-b44b29e32a14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.554 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[d90dc922-97ff-4ffb-96dd-efb976e8b6b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.568 141754 DEBUG oslo.privsep.daemon [-] privsep: reply[698b46b5-a134-47dd-be7b-52bc58a19e7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.597 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[06a82a12-990b-40fa-8667-7d53fa1b96ae]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.606 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.613 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:27 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:27Z|00065|binding|INFO|Setting lport 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 ovn-installed in OVS
Dec  6 05:14:27 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:27Z|00066|binding|INFO|Setting lport 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 up in Southbound
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.624 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.630 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[9c3cbf50-8715-4ae7-a244-d1bd75495aa2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.635 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[4f1ab3f4-732e-4637-a75c-7ffb7f9ee04a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 NetworkManager[48884]: <info>  [1765016067.6368] manager: (tapef8aaff1-00): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.667 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[6b6bb029-9499-4aa0-968e-536899bd4309]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.669 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[78e1361e-7b5b-4f4e-904e-0f9c8a012fb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:27 np0005548918 NetworkManager[48884]: <info>  [1765016067.6880] device (tapef8aaff1-00): carrier: link connected
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.695 233220 DEBUG oslo.privsep.daemon [-] privsep: reply[fb23b0cc-b08b-4eb2-a3c0-be3add394cd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.708 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[a63eff87-2319-4bda-a38b-8cc6c441f418]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef8aaff1-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e6:e2:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444129, 'reachable_time': 41630, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240441, 'error': None, 'target': 'ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.723 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[602e3d5c-5083-4e7c-a033-bbac9e5e012b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee6:e290'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444129, 'tstamp': 444129}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240442, 'error': None, 'target': 'ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.738 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[fe8b4579-d9fd-4608-afd0-36d9df3f43c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef8aaff1-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e6:e2:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444129, 'reachable_time': 41630, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240450, 'error': None, 'target': 'ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.763 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[50c5500c-75f9-47fb-8be9-8ab6c2800462]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:27 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a90000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.827 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[486f07e9-86e6-40e2-a51a-8e5133c85d4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.829 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef8aaff1-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.829 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.830 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef8aaff1-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.832 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:27 np0005548918 NetworkManager[48884]: <info>  [1765016067.8328] manager: (tapef8aaff1-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Dec  6 05:14:27 np0005548918 kernel: tapef8aaff1-00: entered promiscuous mode
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.835 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.836 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapef8aaff1-00, col_values=(('external_ids', {'iface-id': '6e1dcf71-e1ba-45b9-bb6f-63d6dce249f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.837 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:27 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:27Z|00067|binding|INFO|Releasing lport 6e1dcf71-e1ba-45b9-bb6f-63d6dce249f2 from this chassis (sb_readonly=0)
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.840 141640 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ef8aaff1-03b0-4544-89c9-035c25f01e5c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ef8aaff1-03b0-4544-89c9-035c25f01e5c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.840 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[178cddd1-c6b8-45e2-baf4-89e952e08252]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.841 141640 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: global
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    log         /dev/log local0 debug
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    log-tag     haproxy-metadata-proxy-ef8aaff1-03b0-4544-89c9-035c25f01e5c
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    user        root
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    group       root
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    maxconn     1024
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    pidfile     /var/lib/neutron/external/pids/ef8aaff1-03b0-4544-89c9-035c25f01e5c.pid.haproxy
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    daemon
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: defaults
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    log global
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    mode http
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    option httplog
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    option dontlognull
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    option http-server-close
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    option forwardfor
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    retries                 3
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    timeout http-request    30s
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    timeout connect         30s
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    timeout client          32s
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    timeout server          32s
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    timeout http-keep-alive 30s
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: listen listener
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    bind 169.254.169.254:80
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    server metadata /var/lib/neutron/metadata_proxy
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]:    http-request add-header X-OVN-Network-ID ef8aaff1-03b0-4544-89c9-035c25f01e5c
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  6 05:14:27 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:27.842 141640 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c', 'env', 'PROCESS_TAG=haproxy-ef8aaff1-03b0-4544-89c9-035c25f01e5c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ef8aaff1-03b0-4544-89c9-035c25f01e5c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.850 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:27 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.879 229250 DEBUG nova.virt.driver [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Emitting event <LifecycleEvent: 1765016067.879353, 8ec1d789-8c20-4dd3-a44b-5565d1293cea => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.880 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] VM Started (Lifecycle Event)#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.909 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.912 229250 DEBUG nova.virt.driver [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Emitting event <LifecycleEvent: 1765016067.8795152, 8ec1d789-8c20-4dd3-a44b-5565d1293cea => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.912 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] VM Paused (Lifecycle Event)#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.936 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.938 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  6 05:14:27 np0005548918 nova_compute[229246]: 2025-12-06 10:14:27.998 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  6 05:14:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:28 np0005548918 podman[240517]: 2025-12-06 10:14:28.240078074 +0000 UTC m=+0.054629392 container create 9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  6 05:14:28 np0005548918 systemd[1]: Started libpod-conmon-9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f.scope.
Dec  6 05:14:28 np0005548918 podman[240517]: 2025-12-06 10:14:28.210231975 +0000 UTC m=+0.024783293 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
Dec  6 05:14:28 np0005548918 systemd[1]: Started libcrun container.
Dec  6 05:14:28 np0005548918 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90cf1135e5602e8670bea1a227bb9b7e1b0a5afde213fea922937aef6c5f079e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  6 05:14:28 np0005548918 podman[240517]: 2025-12-06 10:14:28.331019817 +0000 UTC m=+0.145571165 container init 9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  6 05:14:28 np0005548918 podman[240517]: 2025-12-06 10:14:28.335691432 +0000 UTC m=+0.150242750 container start 9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec  6 05:14:28 np0005548918 neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c[240532]: [NOTICE]   (240536) : New worker (240538) forked
Dec  6 05:14:28 np0005548918 neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c[240532]: [NOTICE]   (240536) : Loading success.
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.530 229250 DEBUG nova.compute.manager [req-ec5a0586-07aa-4de3-9dc3-fe75d6fbaf5b req-4370c8fd-a173-4518-ab4e-068af98952b2 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received event network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.531 229250 DEBUG oslo_concurrency.lockutils [req-ec5a0586-07aa-4de3-9dc3-fe75d6fbaf5b req-4370c8fd-a173-4518-ab4e-068af98952b2 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.531 229250 DEBUG oslo_concurrency.lockutils [req-ec5a0586-07aa-4de3-9dc3-fe75d6fbaf5b req-4370c8fd-a173-4518-ab4e-068af98952b2 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.531 229250 DEBUG oslo_concurrency.lockutils [req-ec5a0586-07aa-4de3-9dc3-fe75d6fbaf5b req-4370c8fd-a173-4518-ab4e-068af98952b2 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.531 229250 DEBUG nova.compute.manager [req-ec5a0586-07aa-4de3-9dc3-fe75d6fbaf5b req-4370c8fd-a173-4518-ab4e-068af98952b2 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Processing event network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.532 229250 DEBUG nova.compute.manager [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.536 229250 DEBUG nova.virt.driver [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] Emitting event <LifecycleEvent: 1765016068.535947, 8ec1d789-8c20-4dd3-a44b-5565d1293cea => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.537 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] VM Resumed (Lifecycle Event)#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.539 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.543 229250 INFO nova.virt.libvirt.driver [-] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Instance spawned successfully.#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.544 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.569 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.577 229250 DEBUG nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.584 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.585 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.585 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.586 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.586 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.587 229250 DEBUG nova.virt.libvirt.driver [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.616 229250 INFO nova.compute.manager [None req-1483642e-e799-4527-888e-e66f56340dc6 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.661 229250 INFO nova.compute.manager [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Took 6.67 seconds to spawn the instance on the hypervisor.#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.662 229250 DEBUG nova.compute.manager [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  6 05:14:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.729 229250 INFO nova.compute.manager [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Took 7.58 seconds to build instance.#033[00m
Dec  6 05:14:28 np0005548918 nova_compute[229246]: 2025-12-06 10:14:28.746 229250 DEBUG oslo_concurrency.lockutils [None req-8b49c184-6462-4290-a293-b5fd2d05c045 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:28 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  6 05:14:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:28 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  6 05:14:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:28.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:29.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:29 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:29 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:29 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a90001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:30 np0005548918 nova_compute[229246]: 2025-12-06 10:14:30.232 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:30 np0005548918 nova_compute[229246]: 2025-12-06 10:14:30.633 229250 DEBUG nova.compute.manager [req-30592830-b3cc-4017-b154-4e1e9465ad5e req-f237c8f5-b222-4eaf-b284-9219162755e3 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received event network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:14:30 np0005548918 nova_compute[229246]: 2025-12-06 10:14:30.634 229250 DEBUG oslo_concurrency.lockutils [req-30592830-b3cc-4017-b154-4e1e9465ad5e req-f237c8f5-b222-4eaf-b284-9219162755e3 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:30 np0005548918 nova_compute[229246]: 2025-12-06 10:14:30.634 229250 DEBUG oslo_concurrency.lockutils [req-30592830-b3cc-4017-b154-4e1e9465ad5e req-f237c8f5-b222-4eaf-b284-9219162755e3 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:30 np0005548918 nova_compute[229246]: 2025-12-06 10:14:30.634 229250 DEBUG oslo_concurrency.lockutils [req-30592830-b3cc-4017-b154-4e1e9465ad5e req-f237c8f5-b222-4eaf-b284-9219162755e3 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:30 np0005548918 nova_compute[229246]: 2025-12-06 10:14:30.634 229250 DEBUG nova.compute.manager [req-30592830-b3cc-4017-b154-4e1e9465ad5e req-f237c8f5-b222-4eaf-b284-9219162755e3 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] No waiting events found dispatching network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:14:30 np0005548918 nova_compute[229246]: 2025-12-06 10:14:30.635 229250 WARNING nova.compute.manager [req-30592830-b3cc-4017-b154-4e1e9465ad5e req-f237c8f5-b222-4eaf-b284-9219162755e3 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received unexpected event network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 for instance with vm_state active and task_state None.#033[00m
Dec  6 05:14:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:14:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:30.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:14:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:14:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:31.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:14:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:31 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:31 np0005548918 podman[240550]: 2025-12-06 10:14:31.219265072 +0000 UTC m=+0.105323588 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  6 05:14:31 np0005548918 nova_compute[229246]: 2025-12-06 10:14:31.602 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:31 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:31 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  6 05:14:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:31 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:14:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:32.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:14:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:14:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:33.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:14:33 np0005548918 nova_compute[229246]: 2025-12-06 10:14:33.189 229250 DEBUG nova.compute.manager [req-e2a65522-896f-4a2b-9b01-746caa9bd34a req-e90de0cf-f996-48c3-ac6b-1d361f6e0298 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received event network-changed-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:14:33 np0005548918 nova_compute[229246]: 2025-12-06 10:14:33.189 229250 DEBUG nova.compute.manager [req-e2a65522-896f-4a2b-9b01-746caa9bd34a req-e90de0cf-f996-48c3-ac6b-1d361f6e0298 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Refreshing instance network info cache due to event network-changed-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  6 05:14:33 np0005548918 nova_compute[229246]: 2025-12-06 10:14:33.189 229250 DEBUG oslo_concurrency.lockutils [req-e2a65522-896f-4a2b-9b01-746caa9bd34a req-e90de0cf-f996-48c3-ac6b-1d361f6e0298 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:14:33 np0005548918 nova_compute[229246]: 2025-12-06 10:14:33.190 229250 DEBUG oslo_concurrency.lockutils [req-e2a65522-896f-4a2b-9b01-746caa9bd34a req-e90de0cf-f996-48c3-ac6b-1d361f6e0298 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquired lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:14:33 np0005548918 nova_compute[229246]: 2025-12-06 10:14:33.190 229250 DEBUG nova.network.neutron [req-e2a65522-896f-4a2b-9b01-746caa9bd34a req-e90de0cf-f996-48c3-ac6b-1d361f6e0298 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Refreshing network info cache for port 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  6 05:14:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:33 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a90001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:33 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:33 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:34 np0005548918 nova_compute[229246]: 2025-12-06 10:14:34.159 229250 DEBUG nova.network.neutron [req-e2a65522-896f-4a2b-9b01-746caa9bd34a req-e90de0cf-f996-48c3-ac6b-1d361f6e0298 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Updated VIF entry in instance network info cache for port 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  6 05:14:34 np0005548918 nova_compute[229246]: 2025-12-06 10:14:34.160 229250 DEBUG nova.network.neutron [req-e2a65522-896f-4a2b-9b01-746caa9bd34a req-e90de0cf-f996-48c3-ac6b-1d361f6e0298 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Updating instance_info_cache with network_info: [{"id": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "address": "fa:16:3e:ac:c2:fe", "network": {"id": "ef8aaff1-03b0-4544-89c9-035c25f01e5c", "bridge": "br-int", "label": "tempest-network-smoke--1887948682", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap887ea51a-ea", "ovs_interfaceid": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:14:34 np0005548918 nova_compute[229246]: 2025-12-06 10:14:34.190 229250 DEBUG oslo_concurrency.lockutils [req-e2a65522-896f-4a2b-9b01-746caa9bd34a req-e90de0cf-f996-48c3-ac6b-1d361f6e0298 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Releasing lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:14:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:14:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:34.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:14:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:14:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:35.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:14:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:35 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:35 np0005548918 nova_compute[229246]: 2025-12-06 10:14:35.233 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:35 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a90001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:35 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:36 np0005548918 nova_compute[229246]: 2025-12-06 10:14:36.605 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:36.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:14:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:37.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:14:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:37 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac80091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:37 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/101437 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  6 05:14:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:37 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a90002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:14:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:38.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:14:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:39.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac80091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:40 np0005548918 nova_compute[229246]: 2025-12-06 10:14:40.236 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:14:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:40.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:14:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:41.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:41 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a90002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:41 np0005548918 nova_compute[229246]: 2025-12-06 10:14:41.609 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:41 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:41 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:41Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ac:c2:fe 10.100.0.8
Dec  6 05:14:41 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:41Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ac:c2:fe 10.100.0.8
Dec  6 05:14:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:41 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:42.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:43.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:43 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:43 np0005548918 nova_compute[229246]: 2025-12-06 10:14:43.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:14:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:43 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:43 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:44.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:14:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:45.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:14:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:45 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:45 np0005548918 nova_compute[229246]: 2025-12-06 10:14:45.237 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:45 np0005548918 nova_compute[229246]: 2025-12-06 10:14:45.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:14:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:45 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a900039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:45 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a900039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  6 05:14:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3418864736' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  6 05:14:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  6 05:14:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3418864736' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  6 05:14:46 np0005548918 nova_compute[229246]: 2025-12-06 10:14:46.611 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:46.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:47.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:47 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:47 np0005548918 nova_compute[229246]: 2025-12-06 10:14:47.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:14:47 np0005548918 nova_compute[229246]: 2025-12-06 10:14:47.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:14:47 np0005548918 nova_compute[229246]: 2025-12-06 10:14:47.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:14:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:47 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:47 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a900039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:48 np0005548918 podman[240615]: 2025-12-06 10:14:48.292256576 +0000 UTC m=+0.174191202 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec  6 05:14:48 np0005548918 nova_compute[229246]: 2025-12-06 10:14:48.531 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:14:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:48.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:49.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:49 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:49 np0005548918 nova_compute[229246]: 2025-12-06 10:14:49.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:14:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:49 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:49 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:50 np0005548918 nova_compute[229246]: 2025-12-06 10:14:50.240 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:51.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:51.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:51 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a900039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:51 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:51.240 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:14:51 np0005548918 nova_compute[229246]: 2025-12-06 10:14:51.240 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:51 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:51.242 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:14:51 np0005548918 nova_compute[229246]: 2025-12-06 10:14:51.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:14:51 np0005548918 nova_compute[229246]: 2025-12-06 10:14:51.613 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:51 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a900039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:51 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:52 np0005548918 nova_compute[229246]: 2025-12-06 10:14:52.531 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:14:52 np0005548918 nova_compute[229246]: 2025-12-06 10:14:52.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:14:52 np0005548918 nova_compute[229246]: 2025-12-06 10:14:52.535 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:14:52 np0005548918 nova_compute[229246]: 2025-12-06 10:14:52.535 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:14:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:52 np0005548918 nova_compute[229246]: 2025-12-06 10:14:52.811 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:14:52 np0005548918 nova_compute[229246]: 2025-12-06 10:14:52.812 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquired lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:14:52 np0005548918 nova_compute[229246]: 2025-12-06 10:14:52.812 229250 DEBUG nova.network.neutron [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  6 05:14:52 np0005548918 nova_compute[229246]: 2025-12-06 10:14:52.812 229250 DEBUG nova.objects.instance [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8ec1d789-8c20-4dd3-a44b-5565d1293cea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  6 05:14:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:53.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:53.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:53 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:53 np0005548918 podman[240775]: 2025-12-06 10:14:53.388793226 +0000 UTC m=+0.073080926 container exec 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Dec  6 05:14:53 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  6 05:14:53 np0005548918 podman[240775]: 2025-12-06 10:14:53.502721495 +0000 UTC m=+0.187009205 container exec_died 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Dec  6 05:14:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:53.682 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:53.683 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:53.684 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:53 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a900039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:53 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:54 np0005548918 podman[240916]: 2025-12-06 10:14:54.10504601 +0000 UTC m=+0.071875344 container exec 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 05:14:54 np0005548918 podman[240916]: 2025-12-06 10:14:54.115572292 +0000 UTC m=+0.082401616 container exec_died 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 05:14:54 np0005548918 podman[240975]: 2025-12-06 10:14:54.332945087 +0000 UTC m=+0.064860796 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:14:54 np0005548918 podman[241029]: 2025-12-06 10:14:54.467122947 +0000 UTC m=+0.058465525 container exec e0ec4d7cccdd0c825989c42c82ae89eddf385a57b9308109ee802c61948bb4ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:14:54 np0005548918 podman[241029]: 2025-12-06 10:14:54.479409327 +0000 UTC m=+0.070751885 container exec_died e0ec4d7cccdd0c825989c42c82ae89eddf385a57b9308109ee802c61948bb4ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  6 05:14:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:54 np0005548918 podman[241096]: 2025-12-06 10:14:54.692154489 +0000 UTC m=+0.064062035 container exec 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna)
Dec  6 05:14:54 np0005548918 nova_compute[229246]: 2025-12-06 10:14:54.694 229250 DEBUG nova.network.neutron [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Updating instance_info_cache with network_info: [{"id": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "address": "fa:16:3e:ac:c2:fe", "network": {"id": "ef8aaff1-03b0-4544-89c9-035c25f01e5c", "bridge": "br-int", "label": "tempest-network-smoke--1887948682", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap887ea51a-ea", "ovs_interfaceid": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:14:54 np0005548918 podman[241096]: 2025-12-06 10:14:54.70753596 +0000 UTC m=+0.079443476 container exec_died 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna)
Dec  6 05:14:54 np0005548918 nova_compute[229246]: 2025-12-06 10:14:54.709 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Releasing lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  6 05:14:54 np0005548918 nova_compute[229246]: 2025-12-06 10:14:54.710 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  6 05:14:54 np0005548918 nova_compute[229246]: 2025-12-06 10:14:54.710 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:14:54 np0005548918 nova_compute[229246]: 2025-12-06 10:14:54.736 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:54 np0005548918 nova_compute[229246]: 2025-12-06 10:14:54.736 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:54 np0005548918 nova_compute[229246]: 2025-12-06 10:14:54.736 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:54 np0005548918 nova_compute[229246]: 2025-12-06 10:14:54.736 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:14:54 np0005548918 nova_compute[229246]: 2025-12-06 10:14:54.736 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:14:54 np0005548918 podman[241180]: 2025-12-06 10:14:54.973375903 +0000 UTC m=+0.065164945 container exec cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg, com.redhat.component=keepalived-container, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, vcs-type=git, version=2.2.4, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec  6 05:14:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:55.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:55 np0005548918 podman[241180]: 2025-12-06 10:14:55.014577875 +0000 UTC m=+0.106366917 container exec_died cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, vcs-type=git, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., version=2.2.4, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, name=keepalived, com.redhat.component=keepalived-container)
Dec  6 05:14:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:55.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:14:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:14:55 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3783245906' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.180 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:14:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:55 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.274 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.277 229250 DEBUG nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.278 229250 DEBUG nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.460 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.463 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4568MB free_disk=59.897186279296875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.463 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.464 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.736 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Instance 8ec1d789-8c20-4dd3-a44b-5565d1293cea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.737 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.738 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.784 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.804 229250 DEBUG oslo_concurrency.lockutils [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.805 229250 DEBUG oslo_concurrency.lockutils [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.806 229250 DEBUG oslo_concurrency.lockutils [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.806 229250 DEBUG oslo_concurrency.lockutils [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.806 229250 DEBUG oslo_concurrency.lockutils [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.808 229250 INFO nova.compute.manager [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Terminating instance#033[00m
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.810 229250 DEBUG nova.compute.manager [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  6 05:14:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:55 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:55 np0005548918 kernel: tap887ea51a-ea (unregistering): left promiscuous mode
Dec  6 05:14:55 np0005548918 NetworkManager[48884]: <info>  [1765016095.8612] device (tap887ea51a-ea): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.888 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:55 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:55Z|00068|binding|INFO|Releasing lport 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 from this chassis (sb_readonly=0)
Dec  6 05:14:55 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:55Z|00069|binding|INFO|Setting lport 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 down in Southbound
Dec  6 05:14:55 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:55Z|00070|binding|INFO|Removing iface tap887ea51a-ea ovn-installed in OVS
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.892 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:55 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:14:55 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:14:55 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:14:55 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:14:55 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  6 05:14:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:55.898 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:c2:fe 10.100.0.8'], port_security=['fa:16:3e:ac:c2:fe 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8ec1d789-8c20-4dd3-a44b-5565d1293cea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef8aaff1-03b0-4544-89c9-035c25f01e5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '62f9558a-9b40-40ff-98c1-c5a43bf7599d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1a37e6e-1014-49d4-9543-ee1567988851, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], logical_port=887ea51a-eae0-4aca-80e0-9a3c0d8b60b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:14:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:55.900 141640 INFO neutron.agent.ovn.metadata.agent [-] Port 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 in datapath ef8aaff1-03b0-4544-89c9-035c25f01e5c unbound from our chassis#033[00m
Dec  6 05:14:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:55.901 141640 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef8aaff1-03b0-4544-89c9-035c25f01e5c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  6 05:14:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:55.902 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b59e6c-977a-4d91-a8c1-f55d597400a0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:55 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:55.903 141640 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c namespace which is not needed anymore#033[00m
Dec  6 05:14:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:55 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a900039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:55 np0005548918 nova_compute[229246]: 2025-12-06 10:14:55.928 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:55 np0005548918 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec  6 05:14:55 np0005548918 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Consumed 13.328s CPU time.
Dec  6 05:14:55 np0005548918 systemd-machined[192688]: Machine qemu-4-instance-0000000c terminated.
Dec  6 05:14:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:56 np0005548918 kernel: tap887ea51a-ea: entered promiscuous mode
Dec  6 05:14:56 np0005548918 NetworkManager[48884]: <info>  [1765016096.0319] manager: (tap887ea51a-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Dec  6 05:14:56 np0005548918 kernel: tap887ea51a-ea (unregistering): left promiscuous mode
Dec  6 05:14:56 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:56Z|00071|binding|INFO|Claiming lport 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 for this chassis.
Dec  6 05:14:56 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:56Z|00072|binding|INFO|887ea51a-eae0-4aca-80e0-9a3c0d8b60b7: Claiming fa:16:3e:ac:c2:fe 10.100.0.8
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.036 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.053 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:c2:fe 10.100.0.8'], port_security=['fa:16:3e:ac:c2:fe 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8ec1d789-8c20-4dd3-a44b-5565d1293cea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef8aaff1-03b0-4544-89c9-035c25f01e5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '62f9558a-9b40-40ff-98c1-c5a43bf7599d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1a37e6e-1014-49d4-9543-ee1567988851, chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], logical_port=887ea51a-eae0-4aca-80e0-9a3c0d8b60b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.053 229250 INFO nova.virt.libvirt.driver [-] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Instance destroyed successfully.#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.053 229250 DEBUG nova.objects.instance [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lazy-loading 'resources' on Instance uuid 8ec1d789-8c20-4dd3-a44b-5565d1293cea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  6 05:14:56 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:56Z|00073|binding|INFO|Setting lport 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 ovn-installed in OVS
Dec  6 05:14:56 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:56Z|00074|binding|INFO|Setting lport 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 up in Southbound
Dec  6 05:14:56 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:56Z|00075|binding|INFO|Releasing lport 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 from this chassis (sb_readonly=1)
Dec  6 05:14:56 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:56Z|00076|if_status|INFO|Not setting lport 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 down as sb is readonly
Dec  6 05:14:56 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:56Z|00077|binding|INFO|Removing iface tap887ea51a-ea ovn-installed in OVS
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.056 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:56 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:56Z|00078|binding|INFO|Releasing lport 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 from this chassis (sb_readonly=0)
Dec  6 05:14:56 np0005548918 ovn_controller[132371]: 2025-12-06T10:14:56Z|00079|binding|INFO|Setting lport 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 down in Southbound
Dec  6 05:14:56 np0005548918 neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c[240532]: [NOTICE]   (240536) : haproxy version is 2.8.14-c23fe91
Dec  6 05:14:56 np0005548918 neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c[240532]: [NOTICE]   (240536) : path to executable is /usr/sbin/haproxy
Dec  6 05:14:56 np0005548918 neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c[240532]: [WARNING]  (240536) : Exiting Master process...
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.073 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.076 229250 DEBUG nova.virt.libvirt.vif [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-06T10:14:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2062141463',display_name='tempest-TestNetworkBasicOps-server-2062141463',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2062141463',id=12,image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDiXUgLqzE4HhCWF2FxGckt3FoPTmhlzqmtYr2BE7YoZEjAsKDyP7+VvUeKn/wEuGf5swQ9zHFbCj1Cz5Vthaw3uQeZQm81Uuov1u5HA4+nHd2yE4AljdVzf3CoWLHUdng==',key_name='tempest-TestNetworkBasicOps-875173470',keypairs=<?>,launch_index=0,launched_at=2025-12-06T10:14:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='92b402c8d3e2476abc98be42a1e6d34e',ramdisk_id='',reservation_id='r-w3v180si',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9489b8a5-a798-4e26-87f9-59bb1eb2e6fd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1971100882',owner_user_name='tempest-TestNetworkBasicOps-1971100882-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T10:14:28Z,user_data=None,user_id='03615580775245e6ae335ee9d785611f',uuid=8ec1d789-8c20-4dd3-a44b-5565d1293cea,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "address": "fa:16:3e:ac:c2:fe", "network": {"id": "ef8aaff1-03b0-4544-89c9-035c25f01e5c", "bridge": "br-int", "label": "tempest-network-smoke--1887948682", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap887ea51a-ea", "ovs_interfaceid": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.077 229250 DEBUG nova.network.os_vif_util [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converting VIF {"id": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "address": "fa:16:3e:ac:c2:fe", "network": {"id": "ef8aaff1-03b0-4544-89c9-035c25f01e5c", "bridge": "br-int", "label": "tempest-network-smoke--1887948682", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap887ea51a-ea", "ovs_interfaceid": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  6 05:14:56 np0005548918 neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c[240532]: [ALERT]    (240536) : Current worker (240538) exited with code 143 (Terminated)
Dec  6 05:14:56 np0005548918 neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c[240532]: [WARNING]  (240536) : All workers exited. Exiting... (0)
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.078 229250 DEBUG nova.network.os_vif_util [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ac:c2:fe,bridge_name='br-int',has_traffic_filtering=True,id=887ea51a-eae0-4aca-80e0-9a3c0d8b60b7,network=Network(ef8aaff1-03b0-4544-89c9-035c25f01e5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap887ea51a-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.079 229250 DEBUG os_vif [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:c2:fe,bridge_name='br-int',has_traffic_filtering=True,id=887ea51a-eae0-4aca-80e0-9a3c0d8b60b7,network=Network(ef8aaff1-03b0-4544-89c9-035c25f01e5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap887ea51a-ea') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  6 05:14:56 np0005548918 systemd[1]: libpod-9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f.scope: Deactivated successfully.
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.081 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:c2:fe 10.100.0.8'], port_security=['fa:16:3e:ac:c2:fe 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8ec1d789-8c20-4dd3-a44b-5565d1293cea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef8aaff1-03b0-4544-89c9-035c25f01e5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92b402c8d3e2476abc98be42a1e6d34e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '62f9558a-9b40-40ff-98c1-c5a43bf7599d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1a37e6e-1014-49d4-9543-ee1567988851, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>], logical_port=887ea51a-eae0-4aca-80e0-9a3c0d8b60b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fef907d3af0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.082 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.083 229250 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap887ea51a-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.084 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:56 np0005548918 podman[241374]: 2025-12-06 10:14:56.086651128 +0000 UTC m=+0.079344453 container died 9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.087 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.090 229250 INFO os_vif [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:c2:fe,bridge_name='br-int',has_traffic_filtering=True,id=887ea51a-eae0-4aca-80e0-9a3c0d8b60b7,network=Network(ef8aaff1-03b0-4544-89c9-035c25f01e5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap887ea51a-ea')#033[00m
Dec  6 05:14:56 np0005548918 systemd[1]: var-lib-containers-storage-overlay-90cf1135e5602e8670bea1a227bb9b7e1b0a5afde213fea922937aef6c5f079e-merged.mount: Deactivated successfully.
Dec  6 05:14:56 np0005548918 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f-userdata-shm.mount: Deactivated successfully.
Dec  6 05:14:56 np0005548918 podman[241374]: 2025-12-06 10:14:56.12859596 +0000 UTC m=+0.121289265 container cleanup 9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  6 05:14:56 np0005548918 systemd[1]: libpod-conmon-9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f.scope: Deactivated successfully.
Dec  6 05:14:56 np0005548918 podman[241430]: 2025-12-06 10:14:56.191132684 +0000 UTC m=+0.042688314 container remove 9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.197 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f7b56e-d6b4-4805-8f43-acc564008341]: (4, ('Sat Dec  6 10:14:55 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c (9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f)\n9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f\nSat Dec  6 10:14:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c (9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f)\n9aaf969c14bf1e473af88e90a54ec92a112df212f2143788ea541a9c2ac8bf3f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.199 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[60ee969d-ea16-4a52-b6da-30ff5c1e2093]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.200 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef8aaff1-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.201 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:56 np0005548918 kernel: tapef8aaff1-00: left promiscuous mode
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.222 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.226 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[01ae648b-0e32-4aed-8c0b-aed2ce613d65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.239 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[d9690760-1adc-4918-abb6-57fc09945c33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.240 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[074f2d2f-e9a8-490d-b33a-e094e71ee634]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.254 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[f60e7226-fb73-4f41-8820-4aaa84be93ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444123, 'reachable_time': 25947, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241447, 'error': None, 'target': 'ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.257 141754 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ef8aaff1-03b0-4544-89c9-035c25f01e5c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  6 05:14:56 np0005548918 systemd[1]: run-netns-ovnmeta\x2def8aaff1\x2d03b0\x2d4544\x2d89c9\x2d035c25f01e5c.mount: Deactivated successfully.
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.257 141754 DEBUG oslo.privsep.daemon [-] privsep: reply[a5308171-79fc-43f0-9d85-b5b63a3ad258]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.258 141640 INFO neutron.agent.ovn.metadata.agent [-] Port 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 in datapath ef8aaff1-03b0-4544-89c9-035c25f01e5c unbound from our chassis#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.259 141640 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef8aaff1-03b0-4544-89c9-035c25f01e5c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.260 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[d6af47c1-d4fa-40d8-b383-abab2bbb07d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.261 141640 INFO neutron.agent.ovn.metadata.agent [-] Port 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 in datapath ef8aaff1-03b0-4544-89c9-035c25f01e5c unbound from our chassis#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.262 141640 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef8aaff1-03b0-4544-89c9-035c25f01e5c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  6 05:14:56 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:56.262 233203 DEBUG oslo.privsep.daemon [-] privsep: reply[a8b8dae3-2940-4e17-aa9f-de70b980bf34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.269 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.275 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.292 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.319 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.319 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.407 229250 DEBUG nova.compute.manager [req-6e06c379-efe2-40d2-a78c-761adddc0b19 req-e95df7f8-4f35-4707-8b1c-95dec52e9fcc d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received event network-changed-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.408 229250 DEBUG nova.compute.manager [req-6e06c379-efe2-40d2-a78c-761adddc0b19 req-e95df7f8-4f35-4707-8b1c-95dec52e9fcc d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Refreshing instance network info cache due to event network-changed-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.408 229250 DEBUG oslo_concurrency.lockutils [req-6e06c379-efe2-40d2-a78c-761adddc0b19 req-e95df7f8-4f35-4707-8b1c-95dec52e9fcc d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.409 229250 DEBUG oslo_concurrency.lockutils [req-6e06c379-efe2-40d2-a78c-761adddc0b19 req-e95df7f8-4f35-4707-8b1c-95dec52e9fcc d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquired lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.410 229250 DEBUG nova.network.neutron [req-6e06c379-efe2-40d2-a78c-761adddc0b19 req-e95df7f8-4f35-4707-8b1c-95dec52e9fcc d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Refreshing network info cache for port 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.451 229250 INFO nova.virt.libvirt.driver [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Deleting instance files /var/lib/nova/instances/8ec1d789-8c20-4dd3-a44b-5565d1293cea_del#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.452 229250 INFO nova.virt.libvirt.driver [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Deletion of /var/lib/nova/instances/8ec1d789-8c20-4dd3-a44b-5565d1293cea_del complete#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.523 229250 INFO nova.compute.manager [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Took 0.71 seconds to destroy the instance on the hypervisor.#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.524 229250 DEBUG oslo.service.loopingcall [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.524 229250 DEBUG nova.compute.manager [-] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.524 229250 DEBUG nova.network.neutron [-] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  6 05:14:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.869 229250 DEBUG nova.compute.manager [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received event network-vif-unplugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.869 229250 DEBUG oslo_concurrency.lockutils [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.869 229250 DEBUG oslo_concurrency.lockutils [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.870 229250 DEBUG oslo_concurrency.lockutils [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.870 229250 DEBUG nova.compute.manager [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] No waiting events found dispatching network-vif-unplugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.870 229250 DEBUG nova.compute.manager [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received event network-vif-unplugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.870 229250 DEBUG nova.compute.manager [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received event network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.870 229250 DEBUG oslo_concurrency.lockutils [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.871 229250 DEBUG oslo_concurrency.lockutils [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.871 229250 DEBUG oslo_concurrency.lockutils [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.871 229250 DEBUG nova.compute.manager [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] No waiting events found dispatching network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.871 229250 WARNING nova.compute.manager [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received unexpected event network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 for instance with vm_state active and task_state deleting.#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.872 229250 DEBUG nova.compute.manager [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received event network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.872 229250 DEBUG oslo_concurrency.lockutils [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.872 229250 DEBUG oslo_concurrency.lockutils [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.872 229250 DEBUG oslo_concurrency.lockutils [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.872 229250 DEBUG nova.compute.manager [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] No waiting events found dispatching network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.873 229250 WARNING nova.compute.manager [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received unexpected event network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 for instance with vm_state active and task_state deleting.#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.873 229250 DEBUG nova.compute.manager [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received event network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.873 229250 DEBUG oslo_concurrency.lockutils [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.873 229250 DEBUG oslo_concurrency.lockutils [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.873 229250 DEBUG oslo_concurrency.lockutils [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.873 229250 DEBUG nova.compute.manager [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] No waiting events found dispatching network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  6 05:14:56 np0005548918 nova_compute[229246]: 2025-12-06 10:14:56.874 229250 WARNING nova.compute.manager [req-9c549673-eeae-4939-b790-a9a3b2779cb6 req-34e22add-00fe-4fdc-9ad4-05f97234c2d6 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received unexpected event network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 for instance with vm_state active and task_state deleting.#033[00m
Dec  6 05:14:56 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  6 05:14:56 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:14:56 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:14:56 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:14:56 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:14:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:57.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:14:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:57.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:14:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:57 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa80014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:57 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:14:57.243 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:14:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:57 np0005548918 nova_compute[229246]: 2025-12-06 10:14:57.718 229250 DEBUG nova.network.neutron [-] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  6 05:14:57 np0005548918 nova_compute[229246]: 2025-12-06 10:14:57.738 229250 INFO nova.compute.manager [-] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Took 1.21 seconds to deallocate network for instance.#033[00m
Dec  6 05:14:57 np0005548918 nova_compute[229246]: 2025-12-06 10:14:57.798 229250 DEBUG oslo_concurrency.lockutils [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:14:57 np0005548918 nova_compute[229246]: 2025-12-06 10:14:57.798 229250 DEBUG oslo_concurrency.lockutils [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:14:57 np0005548918 nova_compute[229246]: 2025-12-06 10:14:57.831 229250 DEBUG oslo_concurrency.processutils [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:14:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:57 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:57 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:14:58 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2727823202' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.319 229250 DEBUG oslo_concurrency.processutils [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.325 229250 DEBUG nova.compute.provider_tree [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.345 229250 DEBUG nova.scheduler.client.report [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.374 229250 DEBUG oslo_concurrency.lockutils [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.428 229250 INFO nova.scheduler.client.report [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Deleted allocations for instance 8ec1d789-8c20-4dd3-a44b-5565d1293cea#033[00m
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.511 229250 DEBUG nova.compute.manager [req-4c968c75-9e2c-4aad-a67c-82a26a1604d3 req-fc5406c1-180c-4f55-9708-bd9e675c30a1 d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received event network-vif-deleted-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.517 229250 DEBUG oslo_concurrency.lockutils [None req-4dc70ea9-1dd2-46da-ae68-5ddfd4d461aa 03615580775245e6ae335ee9d785611f 92b402c8d3e2476abc98be42a1e6d34e - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:14:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.799 229250 DEBUG nova.network.neutron [req-6e06c379-efe2-40d2-a78c-761adddc0b19 req-e95df7f8-4f35-4707-8b1c-95dec52e9fcc d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Updated VIF entry in instance network info cache for port 887ea51a-eae0-4aca-80e0-9a3c0d8b60b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.799 229250 DEBUG nova.network.neutron [req-6e06c379-efe2-40d2-a78c-761adddc0b19 req-e95df7f8-4f35-4707-8b1c-95dec52e9fcc d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Updating instance_info_cache with network_info: [{"id": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "address": "fa:16:3e:ac:c2:fe", "network": {"id": "ef8aaff1-03b0-4544-89c9-035c25f01e5c", "bridge": "br-int", "label": "tempest-network-smoke--1887948682", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "92b402c8d3e2476abc98be42a1e6d34e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap887ea51a-ea", "ovs_interfaceid": "887ea51a-eae0-4aca-80e0-9a3c0d8b60b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.814 229250 DEBUG oslo_concurrency.lockutils [req-6e06c379-efe2-40d2-a78c-761adddc0b19 req-e95df7f8-4f35-4707-8b1c-95dec52e9fcc d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Releasing lock "refresh_cache-8ec1d789-8c20-4dd3-a44b-5565d1293cea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.936 229250 DEBUG nova.compute.manager [req-b8d39c57-2e59-4792-af58-bd87f06aa706 req-d4325356-ab20-42da-aa3a-72114560d71a d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received event network-vif-unplugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.936 229250 DEBUG oslo_concurrency.lockutils [req-b8d39c57-2e59-4792-af58-bd87f06aa706 req-d4325356-ab20-42da-aa3a-72114560d71a d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.936 229250 DEBUG oslo_concurrency.lockutils [req-b8d39c57-2e59-4792-af58-bd87f06aa706 req-d4325356-ab20-42da-aa3a-72114560d71a d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.937 229250 DEBUG oslo_concurrency.lockutils [req-b8d39c57-2e59-4792-af58-bd87f06aa706 req-d4325356-ab20-42da-aa3a-72114560d71a d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.937 229250 DEBUG nova.compute.manager [req-b8d39c57-2e59-4792-af58-bd87f06aa706 req-d4325356-ab20-42da-aa3a-72114560d71a d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] No waiting events found dispatching network-vif-unplugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.937 229250 WARNING nova.compute.manager [req-b8d39c57-2e59-4792-af58-bd87f06aa706 req-d4325356-ab20-42da-aa3a-72114560d71a d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received unexpected event network-vif-unplugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 for instance with vm_state deleted and task_state None.
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.937 229250 DEBUG nova.compute.manager [req-b8d39c57-2e59-4792-af58-bd87f06aa706 req-d4325356-ab20-42da-aa3a-72114560d71a d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received event network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.938 229250 DEBUG oslo_concurrency.lockutils [req-b8d39c57-2e59-4792-af58-bd87f06aa706 req-d4325356-ab20-42da-aa3a-72114560d71a d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Acquiring lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.938 229250 DEBUG oslo_concurrency.lockutils [req-b8d39c57-2e59-4792-af58-bd87f06aa706 req-d4325356-ab20-42da-aa3a-72114560d71a d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.938 229250 DEBUG oslo_concurrency.lockutils [req-b8d39c57-2e59-4792-af58-bd87f06aa706 req-d4325356-ab20-42da-aa3a-72114560d71a d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] Lock "8ec1d789-8c20-4dd3-a44b-5565d1293cea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.938 229250 DEBUG nova.compute.manager [req-b8d39c57-2e59-4792-af58-bd87f06aa706 req-d4325356-ab20-42da-aa3a-72114560d71a d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] No waiting events found dispatching network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  6 05:14:58 np0005548918 nova_compute[229246]: 2025-12-06 10:14:58.938 229250 WARNING nova.compute.manager [req-b8d39c57-2e59-4792-af58-bd87f06aa706 req-d4325356-ab20-42da-aa3a-72114560d71a d115944fbcd7470eae10054ca89c839d a4625a082db94534a44dd9543f68be02 - - default default] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Received unexpected event network-vif-plugged-887ea51a-eae0-4aca-80e0-9a3c0d8b60b7 for instance with vm_state deleted and task_state None.
Dec  6 05:14:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:14:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:59.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:14:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:14:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:14:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:14:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:59.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:14:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:59 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:14:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:14:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:59 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:14:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:14:59 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:00 np0005548918 nova_compute[229246]: 2025-12-06 10:15:00.275 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:15:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:01.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:01 np0005548918 nova_compute[229246]: 2025-12-06 10:15:01.085 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:15:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:01.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:01 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:01 np0005548918 podman[241504]: 2025-12-06 10:15:01.690067469 +0000 UTC m=+0.082616161 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  6 05:15:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:01 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a900039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:01 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:15:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:15:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:03.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:03.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:03 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:03 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:03 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a900039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:15:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:05.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:15:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:05.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:05 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:05 np0005548918 nova_compute[229246]: 2025-12-06 10:15:05.279 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:15:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:05 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:05 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:06 np0005548918 nova_compute[229246]: 2025-12-06 10:15:06.088 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:15:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:07.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:07.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:07 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:07 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:07 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:09.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:09.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:09 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:09 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:09 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:10 np0005548918 nova_compute[229246]: 2025-12-06 10:15:10.330 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:15:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:11.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:11 np0005548918 nova_compute[229246]: 2025-12-06 10:15:11.050 229250 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765016096.049361, 8ec1d789-8c20-4dd3-a44b-5565d1293cea => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  6 05:15:11 np0005548918 nova_compute[229246]: 2025-12-06 10:15:11.051 229250 INFO nova.compute.manager [-] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] VM Stopped (Lifecycle Event)
Dec  6 05:15:11 np0005548918 nova_compute[229246]: 2025-12-06 10:15:11.089 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:15:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:11.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:11 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a900042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:11 np0005548918 nova_compute[229246]: 2025-12-06 10:15:11.360 229250 DEBUG nova.compute.manager [None req-7012e5f7-4ccd-4c95-9527-ae322f42e639 - - - - - -] [instance: 8ec1d789-8c20-4dd3-a44b-5565d1293cea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  6 05:15:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:11 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a900042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:11 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:12 np0005548918 nova_compute[229246]: 2025-12-06 10:15:12.441 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:15:12 np0005548918 nova_compute[229246]: 2025-12-06 10:15:12.545 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:15:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:13.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:13.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:13 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:13 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:13 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:15:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:15.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:15:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:15.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:15 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:15 np0005548918 nova_compute[229246]: 2025-12-06 10:15:15.333 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:15 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:15 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a980016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:16 np0005548918 nova_compute[229246]: 2025-12-06 10:15:16.091 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:17.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:17.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:17 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:17 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:17 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:19.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:19.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:19 np0005548918 podman[241572]: 2025-12-06 10:15:19.226959938 +0000 UTC m=+0.118398720 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:15:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:19 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a980016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:19 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:19 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:20 np0005548918 nova_compute[229246]: 2025-12-06 10:15:20.335 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:15:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:21.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:15:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:21 np0005548918 nova_compute[229246]: 2025-12-06 10:15:21.093 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:21.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:21 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:21 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a980016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:21 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:23.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:23.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:23 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:23 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:23 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:25.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:25.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:25 np0005548918 podman[241606]: 2025-12-06 10:15:25.162121094 +0000 UTC m=+0.050182994 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:15:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:25 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:25 np0005548918 nova_compute[229246]: 2025-12-06 10:15:25.337 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:25 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:25 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:26 np0005548918 nova_compute[229246]: 2025-12-06 10:15:26.096 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:15:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:27.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:15:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:15:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:27.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:15:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:27 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:27 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:27 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:15:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:29.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:15:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:15:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:29.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:15:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:29 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:29 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:29 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:30 np0005548918 nova_compute[229246]: 2025-12-06 10:15:30.339 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:31.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:31 np0005548918 nova_compute[229246]: 2025-12-06 10:15:31.097 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:15:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:31.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:15:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:31 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:31 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:31 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:32 np0005548918 podman[241634]: 2025-12-06 10:15:32.195273437 +0000 UTC m=+0.082962071 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  6 05:15:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:33.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:33.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:33 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:33 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:33 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:35.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:35.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:35 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:35 np0005548918 nova_compute[229246]: 2025-12-06 10:15:35.340 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:35 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:35 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:36 np0005548918 nova_compute[229246]: 2025-12-06 10:15:36.099 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:37.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:37.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:37 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:37 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:37 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:39.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:39.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:39 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:40 np0005548918 nova_compute[229246]: 2025-12-06 10:15:40.342 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:40 np0005548918 nova_compute[229246]: 2025-12-06 10:15:40.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:15:40 np0005548918 nova_compute[229246]: 2025-12-06 10:15:40.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  6 05:15:40 np0005548918 nova_compute[229246]: 2025-12-06 10:15:40.592 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  6 05:15:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:41.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:41 np0005548918 nova_compute[229246]: 2025-12-06 10:15:41.101 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:41.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:41 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:41 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:41 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:43.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:15:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:43.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:15:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:43 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:43 np0005548918 nova_compute[229246]: 2025-12-06 10:15:43.592 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:15:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:43 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:43 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac80096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:45.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:45.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:45 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:45 np0005548918 nova_compute[229246]: 2025-12-06 10:15:45.344 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:45 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:45 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  6 05:15:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3121417538' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  6 05:15:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  6 05:15:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3121417538' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  6 05:15:46 np0005548918 nova_compute[229246]: 2025-12-06 10:15:46.103 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:47.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:47.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:47 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:47 np0005548918 nova_compute[229246]: 2025-12-06 10:15:47.534 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:15:47 np0005548918 nova_compute[229246]: 2025-12-06 10:15:47.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:15:47 np0005548918 nova_compute[229246]: 2025-12-06 10:15:47.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:15:47 np0005548918 nova_compute[229246]: 2025-12-06 10:15:47.535 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:15:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:47 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:47 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:15:48 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3505482668' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:15:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:49.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:49.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:49 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:49 np0005548918 ovn_controller[132371]: 2025-12-06T10:15:49Z|00080|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Dec  6 05:15:49 np0005548918 nova_compute[229246]: 2025-12-06 10:15:49.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:15:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:49 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:49 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:50 np0005548918 podman[241702]: 2025-12-06 10:15:50.192166635 +0000 UTC m=+0.082554309 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  6 05:15:50 np0005548918 nova_compute[229246]: 2025-12-06 10:15:50.347 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:51.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:51 np0005548918 nova_compute[229246]: 2025-12-06 10:15:51.105 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:51.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:51 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:51 np0005548918 nova_compute[229246]: 2025-12-06 10:15:51.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:15:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:51 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:51 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac8008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:53.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:53.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:53 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:53 np0005548918 nova_compute[229246]: 2025-12-06 10:15:53.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:15:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:15:53.683 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:15:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:15:53.684 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:15:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:15:53.684 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:15:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:53 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:53 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1abc003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:54 np0005548918 nova_compute[229246]: 2025-12-06 10:15:54.540 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:15:54 np0005548918 nova_compute[229246]: 2025-12-06 10:15:54.540 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:15:54 np0005548918 nova_compute[229246]: 2025-12-06 10:15:54.541 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:15:54 np0005548918 nova_compute[229246]: 2025-12-06 10:15:54.541 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:15:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:54 np0005548918 nova_compute[229246]: 2025-12-06 10:15:54.728 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:15:54 np0005548918 nova_compute[229246]: 2025-12-06 10:15:54.729 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:15:54 np0005548918 nova_compute[229246]: 2025-12-06 10:15:54.761 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:15:54 np0005548918 nova_compute[229246]: 2025-12-06 10:15:54.761 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:15:54 np0005548918 nova_compute[229246]: 2025-12-06 10:15:54.762 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:15:54 np0005548918 nova_compute[229246]: 2025-12-06 10:15:54.762 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:15:54 np0005548918 nova_compute[229246]: 2025-12-06 10:15:54.762 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:15:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:55.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:15:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:55.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:15:55 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3624299820' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:15:55 np0005548918 nova_compute[229246]: 2025-12-06 10:15:55.210 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:15:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:55 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac80011e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:55 np0005548918 podman[241783]: 2025-12-06 10:15:55.310602908 +0000 UTC m=+0.059662347 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  6 05:15:55 np0005548918 nova_compute[229246]: 2025-12-06 10:15:55.348 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:55 np0005548918 nova_compute[229246]: 2025-12-06 10:15:55.428 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:15:55 np0005548918 nova_compute[229246]: 2025-12-06 10:15:55.431 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4847MB free_disk=59.94289016723633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:15:55 np0005548918 nova_compute[229246]: 2025-12-06 10:15:55.431 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:15:55 np0005548918 nova_compute[229246]: 2025-12-06 10:15:55.432 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:15:55 np0005548918 nova_compute[229246]: 2025-12-06 10:15:55.549 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:15:55 np0005548918 nova_compute[229246]: 2025-12-06 10:15:55.549 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:15:55 np0005548918 nova_compute[229246]: 2025-12-06 10:15:55.678 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:15:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:55 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac80011e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:55 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:56 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:15:56 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1552888335' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:15:56 np0005548918 nova_compute[229246]: 2025-12-06 10:15:56.107 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:15:56 np0005548918 nova_compute[229246]: 2025-12-06 10:15:56.111 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:15:56 np0005548918 nova_compute[229246]: 2025-12-06 10:15:56.116 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:15:56 np0005548918 nova_compute[229246]: 2025-12-06 10:15:56.130 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:15:56 np0005548918 nova_compute[229246]: 2025-12-06 10:15:56.157 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:15:56 np0005548918 nova_compute[229246]: 2025-12-06 10:15:56.158 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.726s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:15:56 np0005548918 nova_compute[229246]: 2025-12-06 10:15:56.158 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:15:56 np0005548918 nova_compute[229246]: 2025-12-06 10:15:56.159 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  6 05:15:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:57.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:15:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:57.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:15:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:57 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:57 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac80011e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:57 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c003750 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:59.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:15:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:15:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:59.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:15:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:59 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:15:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:15:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:15:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:59 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:15:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:15:59 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac800a6c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:00 np0005548918 nova_compute[229246]: 2025-12-06 10:16:00.350 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:16:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:01.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:16:01 np0005548918 nova_compute[229246]: 2025-12-06 10:16:01.109 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:01.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:01 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c004350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:01 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:16:01.410 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:16:01 np0005548918 nova_compute[229246]: 2025-12-06 10:16:01.410 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:01 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:16:01.412 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:16:01 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:16:01.412 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:16:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:01 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:01 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:16:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:03.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:16:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:03.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:03 np0005548918 podman[241912]: 2025-12-06 10:16:03.179485371 +0000 UTC m=+0.066192671 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec  6 05:16:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:03 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:03 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c004350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:03 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:05.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:05.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:05 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:05 np0005548918 nova_compute[229246]: 2025-12-06 10:16:05.352 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:05 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac800a6c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:05 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c004350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:16:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:06 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:16:06 np0005548918 nova_compute[229246]: 2025-12-06 10:16:06.112 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:07 np0005548918 ceph-mon[75798]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Dec  6 05:16:07 np0005548918 ceph-mon[75798]: Cluster is now healthy
Dec  6 05:16:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:07.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:07.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:07 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:07 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:07 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac800a6c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:09.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:09.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:09 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a9c004350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:09 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1a98003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:09 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aa8004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:09 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:10 np0005548918 nova_compute[229246]: 2025-12-06 10:16:10.355 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:11 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:11 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:16:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:11.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:11 np0005548918 nova_compute[229246]: 2025-12-06 10:16:11.114 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:11.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:11 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac800a6c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  6 05:16:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:11 np0005548918 kernel: ganesha.nfsd[241694]: segfault at 50 ip 00007f1b72fab32e sp 00007f1b34ff8210 error 4 in libntirpc.so.5.8[7f1b72f90000+2c000] likely on CPU 1 (core 0, socket 1)
Dec  6 05:16:11 np0005548918 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  6 05:16:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb[239582]: 06/12/2025 10:16:11 : epoch 693401d3 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1ac800a6c0 fd 38 proxy ignored for local
Dec  6 05:16:11 np0005548918 systemd[1]: Started Process Core Dump (PID 241966/UID 0).
Dec  6 05:16:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:13 np0005548918 systemd-coredump[241967]: Process 239586 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 64:#012#0  0x00007f1b72fab32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  6 05:16:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:13.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:13 np0005548918 systemd[1]: systemd-coredump@9-241966-0.service: Deactivated successfully.
Dec  6 05:16:13 np0005548918 systemd[1]: systemd-coredump@9-241966-0.service: Consumed 1.101s CPU time.
Dec  6 05:16:13 np0005548918 podman[241973]: 2025-12-06 10:16:13.165421806 +0000 UTC m=+0.037436943 container died e0ec4d7cccdd0c825989c42c82ae89eddf385a57b9308109ee802c61948bb4ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  6 05:16:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:16:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:13.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:16:13 np0005548918 systemd[1]: var-lib-containers-storage-overlay-2b7ca7e97f3abb57b185ebb8059b67b96028c2cdfdce1eef328b75ca6d06a511-merged.mount: Deactivated successfully.
Dec  6 05:16:13 np0005548918 podman[241973]: 2025-12-06 10:16:13.212338572 +0000 UTC m=+0.084353679 container remove e0ec4d7cccdd0c825989c42c82ae89eddf385a57b9308109ee802c61948bb4ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-1-0-compute-2-sseuqb, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  6 05:16:13 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Main process exited, code=exited, status=139/n/a
Dec  6 05:16:13 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Failed with result 'exit-code'.
Dec  6 05:16:13 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.772s CPU time.
Dec  6 05:16:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:16:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:15.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:16:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:16:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:15.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:16:15 np0005548918 nova_compute[229246]: 2025-12-06 10:16:15.356 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:16 np0005548918 nova_compute[229246]: 2025-12-06 10:16:16.116 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:16:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:17.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:16:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:17.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/101617 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:16:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:19.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:19.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:20 np0005548918 nova_compute[229246]: 2025-12-06 10:16:20.359 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:21.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:21 np0005548918 nova_compute[229246]: 2025-12-06 10:16:21.159 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:21.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:21 np0005548918 podman[242052]: 2025-12-06 10:16:21.24323312 +0000 UTC m=+0.128656353 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  6 05:16:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:23.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:23.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:23 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Scheduled restart job, restart counter is at 10.
Dec  6 05:16:23 np0005548918 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:16:23 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Consumed 1.772s CPU time.
Dec  6 05:16:23 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Start request repeated too quickly.
Dec  6 05:16:23 np0005548918 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.1.0.compute-2.sseuqb.service: Failed with result 'exit-code'.
Dec  6 05:16:23 np0005548918 systemd[1]: Failed to start Ceph nfs.cephfs.1.0.compute-2.sseuqb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec  6 05:16:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:25.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:25.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:25 np0005548918 nova_compute[229246]: 2025-12-06 10:16:25.362 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:26 np0005548918 nova_compute[229246]: 2025-12-06 10:16:26.161 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:26 np0005548918 podman[242082]: 2025-12-06 10:16:26.169841393 +0000 UTC m=+0.057539261 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  6 05:16:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:27.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:16:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:27.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:16:27 np0005548918 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  6 05:16:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:29.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:16:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:29.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:16:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:30 np0005548918 nova_compute[229246]: 2025-12-06 10:16:30.363 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:31.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:31 np0005548918 nova_compute[229246]: 2025-12-06 10:16:31.164 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:16:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:31.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:16:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:33.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:33.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:34 np0005548918 podman[242110]: 2025-12-06 10:16:34.178097575 +0000 UTC m=+0.064170988 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:16:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:35.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:35.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:35 np0005548918 nova_compute[229246]: 2025-12-06 10:16:35.366 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:36 np0005548918 nova_compute[229246]: 2025-12-06 10:16:36.166 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:37.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:37.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:39.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:39.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:40 np0005548918 nova_compute[229246]: 2025-12-06 10:16:40.367 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/101640 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:16:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:41.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:41 np0005548918 nova_compute[229246]: 2025-12-06 10:16:41.168 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:16:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:41.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:16:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:43.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:43.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:45.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:45.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:45 np0005548918 nova_compute[229246]: 2025-12-06 10:16:45.369 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:45 np0005548918 nova_compute[229246]: 2025-12-06 10:16:45.981 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:16:46 np0005548918 nova_compute[229246]: 2025-12-06 10:16:46.170 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:16:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:47.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:16:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:47.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:47 np0005548918 nova_compute[229246]: 2025-12-06 10:16:47.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:16:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:48 np0005548918 nova_compute[229246]: 2025-12-06 10:16:48.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:16:48 np0005548918 nova_compute[229246]: 2025-12-06 10:16:48.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:16:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:16:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:49.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:16:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:49.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:49 np0005548918 nova_compute[229246]: 2025-12-06 10:16:49.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:16:49 np0005548918 nova_compute[229246]: 2025-12-06 10:16:49.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:16:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:50 np0005548918 nova_compute[229246]: 2025-12-06 10:16:50.371 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:51.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:51 np0005548918 nova_compute[229246]: 2025-12-06 10:16:51.172 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:51.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:51 np0005548918 nova_compute[229246]: 2025-12-06 10:16:51.530 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:16:51 np0005548918 nova_compute[229246]: 2025-12-06 10:16:51.551 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:16:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:52 np0005548918 podman[242173]: 2025-12-06 10:16:52.220366657 +0000 UTC m=+0.100080029 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller)
Dec  6 05:16:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:53.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:16:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:53.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:16:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:16:53.685 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:16:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:16:53.685 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:16:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:16:53.685 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:16:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:54 np0005548918 nova_compute[229246]: 2025-12-06 10:16:54.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:16:54 np0005548918 nova_compute[229246]: 2025-12-06 10:16:54.571 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:16:54 np0005548918 nova_compute[229246]: 2025-12-06 10:16:54.572 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:16:54 np0005548918 nova_compute[229246]: 2025-12-06 10:16:54.572 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:16:54 np0005548918 nova_compute[229246]: 2025-12-06 10:16:54.572 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:16:54 np0005548918 nova_compute[229246]: 2025-12-06 10:16:54.573 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:16:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:16:55 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1188976781' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.089 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:16:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:55.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:55.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.268 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.269 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4877MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.269 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.269 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:16:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.351 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.351 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.374 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.593 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing inventories for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.609 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating ProviderTree inventory for provider 31f5f484-bf36-44de-83b8-7b434061a77b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.609 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating inventory in ProviderTree for provider 31f5f484-bf36-44de-83b8-7b434061a77b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.623 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing aggregate associations for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.643 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing trait associations for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE4A,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_ABM,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,HW_CPU_X86_AVX2,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  6 05:16:55 np0005548918 nova_compute[229246]: 2025-12-06 10:16:55.668 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:16:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:56 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:16:56 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3022049998' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:16:56 np0005548918 nova_compute[229246]: 2025-12-06 10:16:56.125 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:16:56 np0005548918 nova_compute[229246]: 2025-12-06 10:16:56.132 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:16:56 np0005548918 nova_compute[229246]: 2025-12-06 10:16:56.150 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:16:56 np0005548918 nova_compute[229246]: 2025-12-06 10:16:56.153 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:16:56 np0005548918 nova_compute[229246]: 2025-12-06 10:16:56.153 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.884s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:16:56 np0005548918 nova_compute[229246]: 2025-12-06 10:16:56.175 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:16:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:16:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:57.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:16:57 np0005548918 nova_compute[229246]: 2025-12-06 10:16:57.149 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:16:57 np0005548918 nova_compute[229246]: 2025-12-06 10:16:57.149 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:16:57 np0005548918 nova_compute[229246]: 2025-12-06 10:16:57.150 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:16:57 np0005548918 nova_compute[229246]: 2025-12-06 10:16:57.150 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:16:57 np0005548918 nova_compute[229246]: 2025-12-06 10:16:57.174 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:16:57 np0005548918 podman[242273]: 2025-12-06 10:16:57.202846833 +0000 UTC m=+0.075373478 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  6 05:16:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:57.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:59.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:16:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:16:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:59.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:16:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:16:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:16:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:16:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:00 np0005548918 nova_compute[229246]: 2025-12-06 10:17:00.374 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:01.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:01 np0005548918 nova_compute[229246]: 2025-12-06 10:17:01.186 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:01.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:03.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:03.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:05.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:05 np0005548918 podman[242300]: 2025-12-06 10:17:05.205002714 +0000 UTC m=+0.076825846 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  6 05:17:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:05.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:05 np0005548918 nova_compute[229246]: 2025-12-06 10:17:05.379 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [WARNING] 339/101705 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  6 05:17:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [NOTICE] 339/101705 (4) : haproxy version is 2.3.17-d1c9119
Dec  6 05:17:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [NOTICE] 339/101705 (4) : path to executable is /usr/local/sbin/haproxy
Dec  6 05:17:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna[85078]: [ALERT] 339/101705 (4) : backend 'backend' has no server available!
Dec  6 05:17:06 np0005548918 nova_compute[229246]: 2025-12-06 10:17:06.188 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:07.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:07.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:09.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:17:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:09.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:17:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:10 np0005548918 nova_compute[229246]: 2025-12-06 10:17:10.382 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:11.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:11 np0005548918 nova_compute[229246]: 2025-12-06 10:17:11.190 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:11.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:12 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:17:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:12 np0005548918 systemd-logind[800]: New session 55 of user zuul.
Dec  6 05:17:12 np0005548918 systemd[1]: Started Session 55 of User zuul.
Dec  6 05:17:13 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:17:13 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:17:13 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:17:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:13.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:13.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:15.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:15.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:15 np0005548918 nova_compute[229246]: 2025-12-06 10:17:15.383 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:16 np0005548918 nova_compute[229246]: 2025-12-06 10:17:16.230 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:16 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec  6 05:17:16 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1630414090' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  6 05:17:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:17.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:17:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:17.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:17:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:18 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:17:18 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:17:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:19.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:17:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:19.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:17:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:20 np0005548918 nova_compute[229246]: 2025-12-06 10:17:20.386 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:21.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:21 np0005548918 nova_compute[229246]: 2025-12-06 10:17:21.232 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:21.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:22 np0005548918 ovs-vsctl[242823]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  6 05:17:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:23 np0005548918 virtqemud[228866]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  6 05:17:23 np0005548918 virtqemud[228866]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  6 05:17:23 np0005548918 virtqemud[228866]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  6 05:17:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:17:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:23.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:17:23 np0005548918 podman[242963]: 2025-12-06 10:17:23.238903994 +0000 UTC m=+0.121038350 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  6 05:17:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:23.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:23 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: cache status {prefix=cache status} (starting...)
Dec  6 05:17:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:23 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: client ls {prefix=client ls} (starting...)
Dec  6 05:17:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:23 np0005548918 lvm[243213]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 05:17:23 np0005548918 lvm[243213]: VG ceph_vg0 finished
Dec  6 05:17:24 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: damage ls {prefix=damage ls} (starting...)
Dec  6 05:17:24 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: dump loads {prefix=dump loads} (starting...)
Dec  6 05:17:24 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Dec  6 05:17:24 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1668178014' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec  6 05:17:24 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec  6 05:17:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:24 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec  6 05:17:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:24 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec  6 05:17:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec  6 05:17:25 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/644308478' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  6 05:17:25 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec  6 05:17:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:25.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:25.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:25 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec  6 05:17:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Dec  6 05:17:25 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3220109428' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec  6 05:17:25 np0005548918 nova_compute[229246]: 2025-12-06 10:17:25.387 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:25 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec  6 05:17:25 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: ops {prefix=ops} (starting...)
Dec  6 05:17:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec  6 05:17:25 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3753402528' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec  6 05:17:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec  6 05:17:25 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3340143316' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec  6 05:17:26 np0005548918 nova_compute[229246]: 2025-12-06 10:17:26.234 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:26 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  6 05:17:26 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/741915433' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  6 05:17:26 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: session ls {prefix=session ls} (starting...)
Dec  6 05:17:26 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: status {prefix=status} (starting...)
Dec  6 05:17:26 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  6 05:17:26 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4128122182' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  6 05:17:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2688023797' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3176248467' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec  6 05:17:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:27.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:27.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3246663567' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3581068770' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec  6 05:17:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2175235751' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec  6 05:17:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2898722618' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  6 05:17:27 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4240707576' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  6 05:17:27 np0005548918 podman[243758]: 2025-12-06 10:17:27.947739099 +0000 UTC m=+0.058518896 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  6 05:17:28 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  6 05:17:28 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3919289705' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  6 05:17:28 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec  6 05:17:28 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2579086106' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec  6 05:17:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:28 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec  6 05:17:28 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/98512507' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec  6 05:17:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:29.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:17:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:29.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:17:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  6 05:17:29 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2166791389' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  6 05:17:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  6 05:17:29 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2906548466' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  6 05:17:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70713344 unmapped: 1515520 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6b000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 1507328 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 1499136 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843773 data_alloc: 218103808 data_used: 45056
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 1499136 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 1490944 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6b000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 1490944 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 1482752 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 1482752 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843773 data_alloc: 218103808 data_used: 45056
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 1482752 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70762496 unmapped: 1466368 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70762496 unmapped: 1466368 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6b000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 1449984 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 40.879638672s of 41.003166199s, submitted: 44
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 1449984 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846109 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70787072 unmapped: 1441792 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 1433600 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 1433600 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c6bfbc00 session 0x55f8c7e0f860
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 1433600 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 1433600 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846109 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 1417216 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 1417216 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 1409024 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 1409024 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 1400832 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845518 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 1400832 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70836224 unmapped: 1392640 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70844416 unmapped: 1384448 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70852608 unmapped: 1376256 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70877184 unmapped: 1351680 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845518 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70877184 unmapped: 1351680 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70885376 unmapped: 1343488 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 1335296 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.492652893s of 18.503259659s, submitted: 3
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 1335296 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 1335296 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847030 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70901760 unmapped: 1327104 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70901760 unmapped: 1327104 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 1318912 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 1318912 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70926336 unmapped: 1302528 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847951 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70934528 unmapped: 1294336 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70942720 unmapped: 1286144 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70942720 unmapped: 1286144 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.018528938s of 10.460161209s, submitted: 4
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70950912 unmapped: 1277952 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70959104 unmapped: 1269760 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70959104 unmapped: 1269760 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70967296 unmapped: 1261568 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70967296 unmapped: 1261568 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70975488 unmapped: 1253376 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70991872 unmapped: 1236992 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70991872 unmapped: 1236992 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 70991872 unmapped: 1236992 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 1220608 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 1220608 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1204224 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1204224 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1204224 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1196032 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1196032 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71049216 unmapped: 1179648 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71049216 unmapped: 1179648 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71049216 unmapped: 1179648 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71057408 unmapped: 1171456 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71057408 unmapped: 1171456 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71057408 unmapped: 1171456 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71057408 unmapped: 1171456 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71065600 unmapped: 1163264 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71065600 unmapped: 1163264 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71073792 unmapped: 1155072 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 1138688 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 1138688 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 1130496 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 1130496 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 1122304 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 1105920 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 1105920 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1097728 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1097728 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1097728 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 1089536 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 1089536 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1081344 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1081344 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1081344 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c7f1a800 session 0x55f8c8231860
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1064960 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1064960 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1064960 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 1056768 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 1056768 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 1048576 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 1040384 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 1040384 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1032192 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1032192 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1015808 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 999424 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 999424 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 999424 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 991232 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 983040 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 983040 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 974848 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c564b800 session 0x55f8c7f810e0
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 974848 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 966656 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 950272 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 950272 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 950272 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 950272 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 942080 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 909312 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 909312 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 901120 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 901120 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 901120 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 892928 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847360 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 892928 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 73.040710449s of 73.040710449s, submitted: 0
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 884736 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 876544 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 876544 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 827392 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849202 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 827392 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 827392 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 819200 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 819200 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 786432 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849202 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 786432 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 778240 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 778240 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 778240 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 770048 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849202 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 770048 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71467008 unmapped: 761856 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 753664 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 753664 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c619c3c0
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 753664 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849202 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 745472 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 745472 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 745472 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 737280 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 712704 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849202 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 712704 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 704512 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 704512 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 696320 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 679936 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849202 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 679936 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 671744 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 671744 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 671744 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 671744 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849202 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 663552 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 655360 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 647168 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 647168 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 630784 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 39.195549011s of 39.732166290s, submitted: 4
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848611 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 630784 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 630784 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 622592 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 622592 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 614400 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 606208 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 606208 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 598016 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 598016 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 581632 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 581632 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 581632 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 573440 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 573440 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 573440 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 565248 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71671808 unmapped: 557056 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71671808 unmapped: 557056 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 548864 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 540672 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 540672 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 532480 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 532480 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 532480 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 516096 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 516096 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 507904 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 507904 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 507904 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 483328 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 483328 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 475136 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 466944 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 466944 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 450560 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 442368 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 442368 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 434176 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 434176 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 401408 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 401408 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 401408 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71835648 unmapped: 393216 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71835648 unmapped: 393216 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 385024 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 385024 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 385024 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71852032 unmapped: 376832 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71852032 unmapped: 376832 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71860224 unmapped: 368640 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 360448 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 360448 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 352256 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 352256 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 344064 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 335872 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 335872 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 335872 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 327680 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71942144 unmapped: 286720 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71942144 unmapped: 286720 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71950336 unmapped: 278528 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71950336 unmapped: 278528 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 270336 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 270336 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 270336 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 262144 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 262144 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71974912 unmapped: 253952 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 245760 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 245760 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 245760 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71991296 unmapped: 237568 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71991296 unmapped: 237568 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 71999488 unmapped: 229376 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72007680 unmapped: 221184 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72007680 unmapped: 221184 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72015872 unmapped: 212992 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72015872 unmapped: 212992 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72048640 unmapped: 180224 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72048640 unmapped: 180224 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72048640 unmapped: 180224 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72056832 unmapped: 172032 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 163840 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 163840 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c74710e0
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72073216 unmapped: 155648 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72073216 unmapped: 155648 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72073216 unmapped: 155648 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72081408 unmapped: 147456 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72081408 unmapped: 147456 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 139264 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 139264 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 139264 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 131072 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 122880 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848020 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72114176 unmapped: 114688 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 106496 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 106496 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 98.311981201s of 98.413398743s, submitted: 2
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72130560 unmapped: 98304 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 81920 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849532 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 73728 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 73728 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 73728 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 65536 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 65536 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851044 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 65536 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72171520 unmapped: 57344 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72171520 unmapped: 57344 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72171520 unmapped: 57344 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 49152 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851044 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 49152 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 40960 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 40960 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 40960 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 24576 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851044 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 24576 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 24576 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 24576 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72220672 unmapped: 8192 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72220672 unmapped: 8192 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851044 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72220672 unmapped: 8192 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72228864 unmapped: 0 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72228864 unmapped: 0 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72237056 unmapped: 1040384 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72237056 unmapped: 1040384 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851044 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851044 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72269824 unmapped: 1007616 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72269824 unmapped: 1007616 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72269824 unmapped: 1007616 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72278016 unmapped: 999424 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72286208 unmapped: 991232 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851044 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72286208 unmapped: 991232 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72286208 unmapped: 991232 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72294400 unmapped: 983040 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72294400 unmapped: 983040 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72302592 unmapped: 974848 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851044 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72302592 unmapped: 974848 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72302592 unmapped: 974848 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72310784 unmapped: 966656 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72318976 unmapped: 958464 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c564b800 session 0x55f8c731ed20
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72327168 unmapped: 950272 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851044 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72327168 unmapped: 950272 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72327168 unmapped: 950272 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72335360 unmapped: 942080 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72327168 unmapped: 950272 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72327168 unmapped: 950272 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851044 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72335360 unmapped: 942080 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72335360 unmapped: 942080 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72335360 unmapped: 942080 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72343552 unmapped: 933888 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72343552 unmapped: 933888 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851044 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72351744 unmapped: 925696 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72351744 unmapped: 925696 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72359936 unmapped: 917504 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 59.355976105s of 59.403640747s, submitted: 2
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 901120 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 901120 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852556 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72384512 unmapped: 892928 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72400896 unmapped: 876544 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72400896 unmapped: 876544 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 843776 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 843776 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853477 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 835584 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 835584 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 835584 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72474624 unmapped: 802816 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72474624 unmapped: 802816 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 794624 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 794624 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72491008 unmapped: 786432 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72491008 unmapped: 786432 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72491008 unmapped: 786432 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72491008 unmapped: 786432 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 778240 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 778240 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 778240 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 770048 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 770048 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 770048 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72515584 unmapped: 761856 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72515584 unmapped: 761856 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72515584 unmapped: 761856 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 753664 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 753664 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72531968 unmapped: 745472 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72548352 unmapped: 729088 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72548352 unmapped: 729088 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 720896 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 720896 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 720896 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 5570 writes, 24K keys, 5570 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5570 writes, 875 syncs, 6.37 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5570 writes, 24K keys, 5570 commit groups, 1.0 writes per commit group, ingest: 19.09 MB, 0.03 MB/s#012Interval WAL: 5570 writes, 875 syncs, 6.37 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55f8c486a9b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55f8c486a9b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72622080 unmapped: 655360 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72622080 unmapped: 655360 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72630272 unmapped: 647168 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72630272 unmapped: 647168 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 638976 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 638976 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 638976 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72646656 unmapped: 630784 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72646656 unmapped: 630784 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72654848 unmapped: 622592 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72654848 unmapped: 622592 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72654848 unmapped: 622592 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 614400 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 614400 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72671232 unmapped: 606208 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72671232 unmapped: 606208 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72671232 unmapped: 606208 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72671232 unmapped: 606208 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 598016 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 598016 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72687616 unmapped: 589824 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 581632 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 581632 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72704000 unmapped: 573440 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72704000 unmapped: 573440 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 565248 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72720384 unmapped: 557056 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72720384 unmapped: 557056 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 548864 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 548864 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 548864 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 540672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 540672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 540672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72744960 unmapped: 532480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 524288 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 524288 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 524288 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 516096 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 516096 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72769536 unmapped: 507904 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72769536 unmapped: 507904 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72769536 unmapped: 507904 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72777728 unmapped: 499712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72777728 unmapped: 499712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72777728 unmapped: 499712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72785920 unmapped: 491520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72785920 unmapped: 491520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72785920 unmapped: 491520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 483328 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 483328 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 475136 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 475136 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72810496 unmapped: 466944 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72810496 unmapped: 466944 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 450560 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 450560 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 92.979804993s of 93.338882446s, submitted: 4
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 516096 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,1])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 1523712 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 1523712 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c82510e0
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73916416 unmapped: 409600 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73932800 unmapped: 393216 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73932800 unmapped: 393216 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73932800 unmapped: 393216 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73932800 unmapped: 393216 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73932800 unmapped: 393216 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 385024 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 385024 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 385024 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 385024 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 385024 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 385024 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852886 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 385024 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 376832 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.892656326s of 17.513534546s, submitted: 215
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 352256 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 352256 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 352256 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854398 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73981952 unmapped: 344064 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73981952 unmapped: 344064 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 335872 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 335872 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 335872 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853807 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 327680 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 319488 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74014720 unmapped: 311296 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74022912 unmapped: 303104 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74031104 unmapped: 294912 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74031104 unmapped: 294912 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74031104 unmapped: 294912 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74047488 unmapped: 278528 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74047488 unmapped: 278528 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74047488 unmapped: 278528 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74055680 unmapped: 270336 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74055680 unmapped: 270336 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74063872 unmapped: 262144 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74063872 unmapped: 262144 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74072064 unmapped: 253952 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74072064 unmapped: 253952 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74080256 unmapped: 245760 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74088448 unmapped: 237568 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74088448 unmapped: 237568 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74096640 unmapped: 229376 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74104832 unmapped: 221184 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74104832 unmapped: 221184 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74113024 unmapped: 212992 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74113024 unmapped: 212992 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74121216 unmapped: 204800 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74121216 unmapped: 204800 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74121216 unmapped: 204800 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74137600 unmapped: 188416 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74137600 unmapped: 188416 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74153984 unmapped: 172032 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74153984 unmapped: 172032 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74162176 unmapped: 163840 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74178560 unmapped: 147456 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74178560 unmapped: 147456 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c7f80f00
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74137600 unmapped: 188416 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74137600 unmapped: 188416 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74137600 unmapped: 188416 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74137600 unmapped: 188416 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74137600 unmapped: 188416 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853216 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 78.128250122s of 78.537132263s, submitted: 3
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74153984 unmapped: 172032 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 856240 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74153984 unmapped: 172032 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74153984 unmapped: 172032 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855649 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855649 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855649 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855649 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74178560 unmapped: 147456 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74178560 unmapped: 147456 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74178560 unmapped: 147456 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855649 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74178560 unmapped: 147456 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74178560 unmapped: 147456 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855649 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855649 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c6bfbc00 session 0x55f8c7de34a0
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855649 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855649 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855649 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855649 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74211328 unmapped: 114688 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 63.596630096s of 63.621204376s, submitted: 3
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74211328 unmapped: 114688 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855058 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74211328 unmapped: 114688 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74211328 unmapped: 114688 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74211328 unmapped: 114688 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74211328 unmapped: 114688 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74211328 unmapped: 114688 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854467 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74211328 unmapped: 114688 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74211328 unmapped: 114688 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74211328 unmapped: 114688 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854467 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854467 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c619cb40
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854467 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854467 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854467 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854467 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 37.452941895s of 37.460189819s, submitted: 2
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c7f1a800 session 0x55f8c7f99a40
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 58.469783783s of 58.474414825s, submitted: 1
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c83f7a40
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 857491 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 857491 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 857491 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 857491 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.103944778s of 17.108892441s, submitted: 1
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860515 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1015808 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1015808 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 75.162078857s of 75.172508240s, submitted: 3
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862948 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74448896 unmapped: 925696 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8052780
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 131.492523193s of 131.637023926s, submitted: 4
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 794624 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 794624 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 794624 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863278 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread fragmentation_score=0.000025 took=0.000037s
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 6030 writes, 25K keys, 6030 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 6030 writes, 1100 syncs, 5.48 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 460 writes, 719 keys, 460 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
Interval WAL: 460 writes, 225 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55f8c486a9b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55f8c486a9b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 105.746887207s of 105.757682800s, submitted: 3
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863608 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 729088 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 729088 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 729088 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.876320839s of 14.883418083s, submitted: 2
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75735040 unmapped: 1736704 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 1654784 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75849728 unmapped: 1622016 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75849728 unmapped: 1622016 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8db10e0
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 66.274696350s of 66.843589783s, submitted: 197
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 96.137619019s of 96.141578674s, submitted: 1
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 146 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1581056 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 147 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 77037568 unmapped: 17219584 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 148 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 149 ms_handle_reset con 0x55f8c564b800 session 0x55f8c60061e0
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 77045760 unmapped: 17211392 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 77053952 unmapped: 17203200 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 77053952 unmapped: 17203200 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937851 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 150 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8ed2b40
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fc25c000/0x0/0x4ffc00000, data 0x905373/0x9be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fc25a000/0x0/0x4ffc00000, data 0x90747b/0x9c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc25a000/0x0/0x4ffc00000, data 0x90747b/0x9c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941715 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941715 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941715 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c6ca1a40
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941715 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8f012c0
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941715 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941715 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 39.368801117s of 39.591522217s, submitted: 60
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943227 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78135296 unmapped: 16121856 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78135296 unmapped: 16121856 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78135296 unmapped: 16121856 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 16113664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943899 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc258000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 16113664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 16113664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 16113664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 16113664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc258000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 16113664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c8adc400 session 0x55f8c8f2c960
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943899 data_alloc: 218103808 data_used: 49152
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8f2cb40
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8f2cd20
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78135296 unmapped: 16121856 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.870017052s of 11.323535919s, submitted: 2
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8f2cf00
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8f2d860
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 8380416 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c8adc800 session 0x55f8c8f2cb40
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 85884928 unmapped: 8372224 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 152 handle_osd_map epochs [152,153], i have 152, src has [1,153]
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8f012c0
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c5eaf0e0
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8f0c960
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8f0cf00
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c8adcc00 session 0x55f8c8ef9680
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 89407488 unmapped: 15417344 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 153 heartbeat osd_stat(store_statfs(0x4fb480000/0x0/0x4ffc00000, data 0x16dd679/0x179a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8f0d4a0
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 89489408 unmapped: 15335424 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1083921 data_alloc: 218103808 data_used: 6873088
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 153 heartbeat osd_stat(store_statfs(0x4fb480000/0x0/0x4ffc00000, data 0x16dd679/0x179a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 89522176 unmapped: 15302656 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8f0d860
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 89522176 unmapped: 15302656 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8f0da40
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8f0de00
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 153 heartbeat osd_stat(store_statfs(0x4fb480000/0x0/0x4ffc00000, data 0x16dd679/0x179a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 89407488 unmapped: 15417344 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 89776128 unmapped: 15048704 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 153 handle_osd_map epochs [153,154], i have 153, src has [1,154]
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175100 data_alloc: 234881024 data_used: 17768448
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb47e000/0x0/0x4ffc00000, data 0x16df64b/0x179d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1a800 session 0x55f8c8f00780
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb47e000/0x0/0x4ffc00000, data 0x16df64b/0x179d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175100 data_alloc: 234881024 data_used: 17768448
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb47e000/0x0/0x4ffc00000, data 0x16df64b/0x179d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100466688 unmapped: 4358144 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.688327789s of 18.928897858s, submitted: 82
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193376 data_alloc: 234881024 data_used: 17817600
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110518272 unmapped: 3989504 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109641728 unmapped: 4866048 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109641728 unmapped: 4866048 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9137000/0x0/0x4ffc00000, data 0x288664b/0x2944000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109682688 unmapped: 4825088 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9137000/0x0/0x4ffc00000, data 0x288664b/0x2944000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109682688 unmapped: 4825088 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1327180 data_alloc: 234881024 data_used: 19890176
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109682688 unmapped: 4825088 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109682688 unmapped: 4825088 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109862912 unmapped: 4644864 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9114000/0x0/0x4ffc00000, data 0x28aa64b/0x2968000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 4603904 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 4603904 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1323668 data_alloc: 234881024 data_used: 19959808
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 4603904 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:29 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 4603904 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 4603904 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.284299850s of 13.593171120s, submitted: 156
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 4603904 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f910a000/0x0/0x4ffc00000, data 0x28b464b/0x2972000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109936640 unmapped: 4571136 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1323085 data_alloc: 234881024 data_used: 19959808
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109936640 unmapped: 4571136 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109936640 unmapped: 4571136 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109936640 unmapped: 4571136 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f910a000/0x0/0x4ffc00000, data 0x28b464b/0x2972000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109944832 unmapped: 4562944 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109953024 unmapped: 4554752 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1323005 data_alloc: 234881024 data_used: 19959808
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8ed32c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8ed2f00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8daf4a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109953024 unmapped: 4554752 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8915e00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add800 session 0x55f8c8ee1c20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 3399680 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c7f80f00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8ef8d20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8dafc20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8db1680
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8addc00 session 0x55f8c8063e00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c89145a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111419392 unmapped: 5259264 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cff000/0x0/0x4ffc00000, data 0x2cbd684/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111419392 unmapped: 5259264 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 5251072 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1362318 data_alloc: 234881024 data_used: 20353024
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 5251072 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 5251072 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 5251072 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cff000/0x0/0x4ffc00000, data 0x2cbd6bd/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111460352 unmapped: 5218304 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111460352 unmapped: 5218304 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1362318 data_alloc: 234881024 data_used: 20353024
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.106805801s of 17.256025314s, submitted: 42
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8ee81e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111476736 unmapped: 5201920 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111484928 unmapped: 5193728 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cff000/0x0/0x4ffc00000, data 0x2cbd6bd/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,3,0,3])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113360896 unmapped: 3317760 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114819072 unmapped: 1859584 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114819072 unmapped: 1859584 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1390523 data_alloc: 234881024 data_used: 24223744
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cfe000/0x0/0x4ffc00000, data 0x2cbd6bd/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114827264 unmapped: 1851392 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114827264 unmapped: 1851392 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114827264 unmapped: 1851392 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cfe000/0x0/0x4ffc00000, data 0x2cbd6bd/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114827264 unmapped: 1851392 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cfe000/0x0/0x4ffc00000, data 0x2cbd6bd/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 1843200 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1390523 data_alloc: 234881024 data_used: 24223744
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 1843200 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.404649734s of 10.734865189s, submitted: 13
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cfe000/0x0/0x4ffc00000, data 0x2cbd6bd/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 1843200 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 1843200 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 117628928 unmapped: 5013504 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116768768 unmapped: 5873664 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1491925 data_alloc: 234881024 data_used: 25108480
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116768768 unmapped: 5873664 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f81b4000/0x0/0x4ffc00000, data 0x38086bd/0x38c8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f81b4000/0x0/0x4ffc00000, data 0x38086bd/0x38c8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 5865472 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f81b1000/0x0/0x4ffc00000, data 0x380b6bd/0x38cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 5849088 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f81b1000/0x0/0x4ffc00000, data 0x380b6bd/0x38cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f81b1000/0x0/0x4ffc00000, data 0x380b6bd/0x38cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 5849088 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 5849088 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1490653 data_alloc: 234881024 data_used: 25112576
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 6242304 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 6242304 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8190000/0x0/0x4ffc00000, data 0x382c6bd/0x38ec000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.670058250s of 11.008224487s, submitted: 128
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 6242304 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8052960
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8f29c20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f818d000/0x0/0x4ffc00000, data 0x382f6bd/0x38ef000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 6242304 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc8000 session 0x55f8c8ee0000
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 9650176 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337761 data_alloc: 234881024 data_used: 20291584
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 9650176 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c8f28b40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f90fd000/0x0/0x4ffc00000, data 0x28c064b/0x297e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 9650176 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 9641984 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 9641984 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c83f74a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c8ee8780
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113008640 unmapped: 9633792 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1338625 data_alloc: 234881024 data_used: 20291584
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f90fd000/0x0/0x4ffc00000, data 0x28c064b/0x297e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,1,1])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c7e0fa40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1007743 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1007743 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.560182571s of 20.807836533s, submitted: 93
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101662720 unmapped: 20979712 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101662720 unmapped: 20979712 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1010767 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101662720 unmapped: 20979712 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1009585 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1009585 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc8000 session 0x55f8c82510e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8250000
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c61a70e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c7f98d20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.574503899s of 13.586823463s, submitted: 4
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c7f990e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc8400 session 0x55f8c83f7680
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102006784 unmapped: 26935296 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c89145a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c8915860
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c8914780
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102006784 unmapped: 26935296 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102006784 unmapped: 26935296 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102014976 unmapped: 26927104 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1040148 data_alloc: 218103808 data_used: 5046272
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c89141e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fad2b000/0x0/0x4ffc00000, data 0xc916bd/0xd51000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc8400 session 0x55f8c8efb860
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102014976 unmapped: 26927104 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102014976 unmapped: 26927104 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8efbe00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c8efa3c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fad2a000/0x0/0x4ffc00000, data 0xc916cd/0xd52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068351 data_alloc: 218103808 data_used: 8724480
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fad2a000/0x0/0x4ffc00000, data 0xc916cd/0xd52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068351 data_alloc: 218103808 data_used: 8724480
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fad2a000/0x0/0x4ffc00000, data 0xc916cd/0xd52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102072320 unmapped: 26869760 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102072320 unmapped: 26869760 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102072320 unmapped: 26869760 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068351 data_alloc: 218103808 data_used: 8724480
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.456418991s of 18.609983444s, submitted: 33
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 24199168 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa60d000/0x0/0x4ffc00000, data 0xf9e6cd/0x105f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104775680 unmapped: 24166400 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104775680 unmapped: 24166400 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104783872 unmapped: 24158208 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104783872 unmapped: 24158208 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 8921088
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 8921088
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 24141824 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 24141824 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 8921088
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 24141824 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 24141824 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 24141824 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 8921088
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 8921088
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.872806549s of 25.992950439s, submitted: 43
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c8efb4a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c731f680
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc8800 session 0x55f8c8dae3c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019398 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019398 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019398 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019398 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019398 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8ee0000
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c8ee1860
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c8ee01e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c8ee1c20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.310274124s of 28.440805435s, submitted: 35
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc8c00 session 0x55f8c8ee14a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c6ca0d20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103276544 unmapped: 36167680 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1133149 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103276544 unmapped: 36167680 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103276544 unmapped: 36167680 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103276544 unmapped: 36167680 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103284736 unmapped: 36159488 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc5000/0x0/0x4ffc00000, data 0x17e86ad/0x18a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103284736 unmapped: 36159488 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1133149 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103292928 unmapped: 36151296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c8db1680
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8f2dc20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103301120 unmapped: 36143104 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc4000/0x0/0x4ffc00000, data 0x17e86d0/0x18a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103301120 unmapped: 36143104 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111468544 unmapped: 27975680 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc4000/0x0/0x4ffc00000, data 0x17e86d0/0x18a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111484928 unmapped: 27959296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1240039 data_alloc: 234881024 data_used: 20131840
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc4000/0x0/0x4ffc00000, data 0x17e86d0/0x18a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111517696 unmapped: 27926528 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111517696 unmapped: 27926528 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111517696 unmapped: 27926528 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc4000/0x0/0x4ffc00000, data 0x17e86d0/0x18a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111517696 unmapped: 27926528 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111525888 unmapped: 27918336 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1240039 data_alloc: 234881024 data_used: 20131840
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111525888 unmapped: 27918336 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111525888 unmapped: 27918336 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc4000/0x0/0x4ffc00000, data 0x17e86d0/0x18a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111525888 unmapped: 27918336 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.866819382s of 19.239244461s, submitted: 49
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 123478016 unmapped: 15966208 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121561088 unmapped: 17883136 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1411225 data_alloc: 234881024 data_used: 22036480
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 17825792 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 17825792 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 17825792 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f88c6000/0x0/0x4ffc00000, data 0x2ce66d0/0x2da6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 17801216 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 17801216 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1415017 data_alloc: 234881024 data_used: 22257664
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 17801216 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 17801216 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f88c4000/0x0/0x4ffc00000, data 0x2ce86d0/0x2da8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 17752064 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 17752064 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.175070763s of 10.511266708s, submitted: 180
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 17752064 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1412699 data_alloc: 234881024 data_used: 22331392
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 17752064 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f88c2000/0x0/0x4ffc00000, data 0x2cea6d0/0x2daa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 17719296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 17719296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 17719296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 17719296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1412931 data_alloc: 234881024 data_used: 22331392
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 17719296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9400 session 0x55f8c61a7860
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9800 session 0x55f8c826b4a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c7de23c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c7de2b40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c7de2780
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 20848640 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7f1c000/0x0/0x4ffc00000, data 0x36906d0/0x3750000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 20815872 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 20815872 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7f1c000/0x0/0x4ffc00000, data 0x36906d0/0x3750000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 20783104 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1484749 data_alloc: 234881024 data_used: 22331392
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 20783104 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7f1c000/0x0/0x4ffc00000, data 0x36906d0/0x3750000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 20783104 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.692651749s of 13.755259514s, submitted: 15
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9400 session 0x55f8c6c14960
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 20480000 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 20455424 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 12771328 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1547897 data_alloc: 251658240 data_used: 31326208
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 12771328 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7ef8000/0x0/0x4ffc00000, data 0x36b46d0/0x3774000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7ef8000/0x0/0x4ffc00000, data 0x36b46d0/0x3774000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 12771328 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 12771328 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 12689408 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 12689408 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1547897 data_alloc: 251658240 data_used: 31326208
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7ef8000/0x0/0x4ffc00000, data 0x36b46d0/0x3774000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 12689408 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 12689408 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7ef8000/0x0/0x4ffc00000, data 0x36b46d0/0x3774000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 12656640 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 12656640 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.174408913s of 12.192111969s, submitted: 6
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 130654208 unmapped: 11943936 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1628439 data_alloc: 251658240 data_used: 31383552
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131645440 unmapped: 10952704 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131006464 unmapped: 11591680 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131006464 unmapped: 11591680 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7471000/0x0/0x4ffc00000, data 0x413b6d0/0x41fb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 11493376 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 11493376 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1634727 data_alloc: 251658240 data_used: 31793152
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c7f62d20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 11493376 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7471000/0x0/0x4ffc00000, data 0x413b6d0/0x41fb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 11493376 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7471000/0x0/0x4ffc00000, data 0x413b6d0/0x41fb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 11493376 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 11493376 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c6c141e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cb69a000 session 0x55f8c6112780
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 11501568 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.638808250s of 10.160227776s, submitted: 89
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1633935 data_alloc: 251658240 data_used: 31793152
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c7471a40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125550592 unmapped: 17047552 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f889e000/0x0/0x4ffc00000, data 0x2d0e6d0/0x2dce000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125550592 unmapped: 17047552 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f88c2000/0x0/0x4ffc00000, data 0x2cea6d0/0x2daa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125550592 unmapped: 17047552 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125550592 unmapped: 17047552 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125550592 unmapped: 17047552 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422843 data_alloc: 234881024 data_used: 22331392
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f88c2000/0x0/0x4ffc00000, data 0x2cea6d0/0x2daa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125550592 unmapped: 17047552 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c6ca1e00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c8230b40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f88c2000/0x0/0x4ffc00000, data 0x2cea6d0/0x2daa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 31105024 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7e7a000 session 0x55f8c8ee90e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1050415 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa95e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1050415 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa95e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.673280716s of 18.836557388s, submitted: 65
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1049240 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1049240 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1049240 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1049240 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.104139328s of 17.108509064s, submitted: 1
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8251a40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113516544 unmapped: 41156608 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c5746960
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c83f6780
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047400 session 0x55f8c83f7c20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c619c780
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112869376 unmapped: 41803776 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112869376 unmapped: 41803776 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112869376 unmapped: 41803776 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c83f7e00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c83f65a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112869376 unmapped: 41803776 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163518 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7e7a000 session 0x55f8c8ee0f00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc7000/0x0/0x4ffc00000, data 0x17e66ad/0x18a5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047000 session 0x55f8c8ee12c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112893952 unmapped: 41779200 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112893952 unmapped: 41779200 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260752 data_alloc: 234881024 data_used: 18857984
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc6000/0x0/0x4ffc00000, data 0x17e66bd/0x18a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc6000/0x0/0x4ffc00000, data 0x17e66bd/0x18a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260752 data_alloc: 234881024 data_used: 18857984
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc6000/0x0/0x4ffc00000, data 0x17e66bd/0x18a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc6000/0x0/0x4ffc00000, data 0x17e66bd/0x18a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc6000/0x0/0x4ffc00000, data 0x17e66bd/0x18a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.263729095s of 17.429533005s, submitted: 55
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc6000/0x0/0x4ffc00000, data 0x17e66bd/0x18a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118702080 unmapped: 35971072 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 35717120 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1327086 data_alloc: 234881024 data_used: 19050496
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 35717120 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 35717120 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 35717120 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 35717120 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96de000/0x0/0x4ffc00000, data 0x1ecd6bd/0x1f8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 35717120 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1327238 data_alloc: 234881024 data_used: 19054592
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96c0000/0x0/0x4ffc00000, data 0x1eec6bd/0x1fac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96c0000/0x0/0x4ffc00000, data 0x1eec6bd/0x1fac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325086 data_alloc: 234881024 data_used: 19058688
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.177231789s of 13.388985634s, submitted: 86
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325222 data_alloc: 234881024 data_used: 19058688
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96b3000/0x0/0x4ffc00000, data 0x1ef96bd/0x1fb9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96b3000/0x0/0x4ffc00000, data 0x1ef96bd/0x1fb9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96b3000/0x0/0x4ffc00000, data 0x1ef96bd/0x1fb9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326126 data_alloc: 234881024 data_used: 19066880
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1f036bd/0x1fc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1f036bd/0x1fc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.655777931s of 12.032317162s, submitted: 4
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326518 data_alloc: 234881024 data_used: 19066880
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c731f680
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7e7a000 session 0x55f8c731f0e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c731ef00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1c00 session 0x55f8c731e780
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96a0000/0x0/0x4ffc00000, data 0x1f0c6bd/0x1fcc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,4])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 27844608 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c8053a40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 35176448 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 8458 writes, 34K keys, 8458 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 8458 writes, 2073 syncs, 4.08 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2428 writes, 9161 keys, 2428 commit groups, 1.0 writes per commit group, ingest: 9.91 MB, 0.02 MB/s#012Interval WAL: 2428 writes, 973 syncs, 2.50 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 35176448 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8d0d000/0x0/0x4ffc00000, data 0x289f6bd/0x295f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 35176448 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1399312 data_alloc: 234881024 data_used: 19066880
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c8ee81e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 35176448 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7e7a000 session 0x55f8c8ee9c20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 35176448 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8ee85a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1c00 session 0x55f8c82305a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119824384 unmapped: 34848768 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119824384 unmapped: 34848768 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124633088 unmapped: 30040064 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1470527 data_alloc: 234881024 data_used: 25440256
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8ce8000/0x0/0x4ffc00000, data 0x28c36cd/0x2984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.146208763s of 12.293154716s, submitted: 25
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 30023680 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8ce5000/0x0/0x4ffc00000, data 0x28c66cd/0x2987000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 30023680 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8ce5000/0x0/0x4ffc00000, data 0x28c66cd/0x2987000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8ce5000/0x0/0x4ffc00000, data 0x28c66cd/0x2987000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 30023680 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 30023680 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 30023680 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1470415 data_alloc: 234881024 data_used: 25440256
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 30023680 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124698624 unmapped: 29974528 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124698624 unmapped: 29974528 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cdf000/0x0/0x4ffc00000, data 0x28cc6cd/0x298d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124698624 unmapped: 29974528 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128163840 unmapped: 26509312 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555717 data_alloc: 234881024 data_used: 26214400
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.894871712s of 10.098556519s, submitted: 67
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127557632 unmapped: 27115520 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128008192 unmapped: 26664960 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f823d000/0x0/0x4ffc00000, data 0x33666cd/0x3427000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f823a000/0x0/0x4ffc00000, data 0x33696cd/0x342a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128073728 unmapped: 26599424 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f823a000/0x0/0x4ffc00000, data 0x33696cd/0x342a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f823a000/0x0/0x4ffc00000, data 0x33696cd/0x342a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1564765 data_alloc: 234881024 data_used: 26984448
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f823a000/0x0/0x4ffc00000, data 0x33696cd/0x342a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1561573 data_alloc: 234881024 data_used: 26984448
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f823d000/0x0/0x4ffc00000, data 0x336e6cd/0x342f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9400 session 0x55f8c7f99e00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.151363373s of 12.191974640s, submitted: 20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c8250b40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c8915680
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336296 data_alloc: 234881024 data_used: 15409152
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f968c000/0x0/0x4ffc00000, data 0x1f206bd/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f968c000/0x0/0x4ffc00000, data 0x1f206bd/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336296 data_alloc: 234881024 data_used: 15409152
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f968c000/0x0/0x4ffc00000, data 0x1f206bd/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f968c000/0x0/0x4ffc00000, data 0x1f206bd/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f968c000/0x0/0x4ffc00000, data 0x1f206bd/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.010532379s of 11.096594810s, submitted: 30
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8ee1860
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8052000
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7e7a000 session 0x55f8c731e3c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1078693 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fab24000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1078693 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fab24000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1078693 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fab24000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1078693 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.836418152s of 16.924983978s, submitted: 44
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114540544 unmapped: 40132608 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114614272 unmapped: 40058880 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113786880 unmapped: 40886272 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8daf680
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 40869888 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 40869888 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1078401 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 40869888 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 40869888 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 40869888 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 40869888 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8052960
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8063e00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7f99860
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c82303c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8ee94a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c7470960
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7f62f00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c7f62780
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c7f625a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114089984 unmapped: 40583168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173436 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa0dc000/0x0/0x4ffc00000, data 0x14d06bd/0x1590000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114089984 unmapped: 40583168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114089984 unmapped: 40583168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114089984 unmapped: 40583168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 40574976 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c7f63c20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 40574976 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c7f63680
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173436 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7f63a40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.971698761s of 14.797449112s, submitted: 253
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c6c14960
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114253824 unmapped: 40419328 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa0dc000/0x0/0x4ffc00000, data 0x14d06bd/0x1590000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114253824 unmapped: 40419328 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 38830080 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 38830080 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 38830080 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258880 data_alloc: 234881024 data_used: 17272832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 38830080 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa0b8000/0x0/0x4ffc00000, data 0x14f46bd/0x15b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa0b8000/0x0/0x4ffc00000, data 0x14f46bd/0x15b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 38830080 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 38830080 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1c00 session 0x55f8c6c150e0
Dec  6 05:17:30 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1575270102' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c6c145a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c7e0f680
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa880000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1090772 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.207107544s of 10.267948151s, submitted: 24
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa880000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1090181 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa880000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1090181 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c731e780
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c74712c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c83f7680
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8ee85a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.252075195s of 10.254258156s, submitted: 1
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c83f7860
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7de2000
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c7f98b40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8ee0f00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8db12c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa380000/0x0/0x4ffc00000, data 0x122d65b/0x12ec000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111353856 unmapped: 43319296 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa380000/0x0/0x4ffc00000, data 0x122d65b/0x12ec000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111353856 unmapped: 43319296 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111353856 unmapped: 43319296 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c83f7c20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111353856 unmapped: 43319296 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7f805a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161277 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c61a6960
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c80625a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112558080 unmapped: 42115072 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112558080 unmapped: 42115072 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114032640 unmapped: 40640512 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa35b000/0x0/0x4ffc00000, data 0x125166b/0x1311000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c6ca1e00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c7f99c20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa35b000/0x0/0x4ffc00000, data 0x125166b/0x1311000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7de3860
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098645 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098645 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112918528 unmapped: 41754624 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112918528 unmapped: 41754624 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112918528 unmapped: 41754624 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112918528 unmapped: 41754624 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112918528 unmapped: 41754624 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098645 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112918528 unmapped: 41754624 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098645 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112934912 unmapped: 41738240 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098645 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112934912 unmapped: 41738240 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112934912 unmapped: 41738240 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112934912 unmapped: 41738240 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112943104 unmapped: 41730048 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c83f63c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864ac00 session 0x55f8c8db0b40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8251c20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c83f65a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.693813324s of 33.888053894s, submitted: 49
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113008640 unmapped: 45342720 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7dd03c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c6c15e00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a800 session 0x55f8c8ee1a40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8ee8d20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8ed30e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182204 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112975872 unmapped: 45375488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112975872 unmapped: 45375488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112975872 unmapped: 45375488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112975872 unmapped: 45375488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c8ee0780
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112975872 unmapped: 45375488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182204 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c7f80960
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a400 session 0x55f8c61a7c20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8ee90e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 45359104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 45350912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113008640 unmapped: 45342720 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115015680 unmapped: 43335680 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115015680 unmapped: 43335680 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255457 data_alloc: 234881024 data_used: 15900672
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115015680 unmapped: 43335680 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115015680 unmapped: 43335680 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115015680 unmapped: 43335680 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c7de2f00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115015680 unmapped: 43335680 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115023872 unmapped: 43327488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255457 data_alloc: 234881024 data_used: 15900672
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115023872 unmapped: 43327488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115023872 unmapped: 43327488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: mgrc ms_handle_reset ms_handle_reset con 0x55f8c7359400
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3885409716
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3885409716,v1:192.168.122.100:6801/3885409716]
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: mgrc handle_mgr_configure stats_period=5
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 43180032 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.289899826s of 18.431455612s, submitted: 40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c6112f00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 35749888 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1330523 data_alloc: 234881024 data_used: 16965632
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9a33000/0x0/0x4ffc00000, data 0x1b736bd/0x1c33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9a21000/0x0/0x4ffc00000, data 0x1b836bd/0x1c43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1330523 data_alloc: 234881024 data_used: 16965632
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9a21000/0x0/0x4ffc00000, data 0x1b836bd/0x1c43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 36511744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 36511744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 36511744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 36511744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1332035 data_alloc: 234881024 data_used: 16965632
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9a21000/0x0/0x4ffc00000, data 0x1b836bd/0x1c43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 36503552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 36503552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.663705826s of 13.881211281s, submitted: 81
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8461860
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c8098780
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c80521e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1113225 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa95a000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1115366 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.681127548s of 10.794657707s, submitted: 41
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114775 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114775 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c7f814a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113983488 unmapped: 44367872 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c826ab40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a000 session 0x55f8c7f99e00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114775 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c83cbc00 session 0x55f8c7f99680
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.544075012s of 12.550822258s, submitted: 2
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8efa3c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8053e00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a000 session 0x55f8c8ee0f00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c8daf2c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f000 session 0x55f8c7e0e960
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 45801472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa4e1000/0x0/0x4ffc00000, data 0x10cd64b/0x118b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 45801472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 45801472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c7f625a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa4e1000/0x0/0x4ffc00000, data 0x10cd64b/0x118b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 45817856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8daeb40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f000 session 0x55f8c7f98780
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 45817856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a000 session 0x55f8c7f98b40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184612 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112541696 unmapped: 45809664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c7f63c20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112558080 unmapped: 45793280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114581504 unmapped: 43769856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa4e0000/0x0/0x4ffc00000, data 0x10cd65b/0x118c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114581504 unmapped: 43769856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114581504 unmapped: 43769856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1230344 data_alloc: 234881024 data_used: 11735040
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114581504 unmapped: 43769856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa4e0000/0x0/0x4ffc00000, data 0x10cd65b/0x118c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 43761664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 43761664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 43761664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa4e0000/0x0/0x4ffc00000, data 0x10cd65b/0x118c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa4e0000/0x0/0x4ffc00000, data 0x10cd65b/0x118c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 43761664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1230344 data_alloc: 234881024 data_used: 11735040
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114597888 unmapped: 43753472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.516071320s of 16.788757324s, submitted: 36
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114614272 unmapped: 43737088 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c8efab40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8db01e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c6c145a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f000 session 0x55f8c83f72c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a000 session 0x55f8c8efaf00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125050880 unmapped: 33300480 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d40000/0x0/0x4ffc00000, data 0x186c6bd/0x192c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125149184 unmapped: 33202176 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 123346944 unmapped: 35004416 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350212 data_alloc: 234881024 data_used: 13533184
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a000 session 0x55f8c84612c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 123346944 unmapped: 35004416 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c84614a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8460f00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f000 session 0x55f8c8461680
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 123371520 unmapped: 34979840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9902000/0x0/0x4ffc00000, data 0x1ca96bd/0x1d69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 123371520 unmapped: 34979840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124461056 unmapped: 33890304 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9901000/0x0/0x4ffc00000, data 0x1ca96f0/0x1d6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125968384 unmapped: 32382976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1378048 data_alloc: 234881024 data_used: 17047552
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125976576 unmapped: 32374784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125976576 unmapped: 32374784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f98e0000/0x0/0x4ffc00000, data 0x1cca6f0/0x1d8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125976576 unmapped: 32374784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125976576 unmapped: 32374784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.158753395s of 12.559136391s, submitted: 160
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125976576 unmapped: 32374784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1377706 data_alloc: 234881024 data_used: 17051648
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125976576 unmapped: 32374784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f98e0000/0x0/0x4ffc00000, data 0x1cca6f0/0x1d8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125984768 unmapped: 32366592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125984768 unmapped: 32366592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125984768 unmapped: 32366592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 134922240 unmapped: 23429120 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1478226 data_alloc: 234881024 data_used: 18644992
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 25133056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f89b7000/0x0/0x4ffc00000, data 0x27dd6f0/0x289f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 25133056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 25133056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f89b4000/0x0/0x4ffc00000, data 0x27e06f0/0x28a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 25133056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 25133056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1475920 data_alloc: 234881024 data_used: 18571264
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 25133056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.758771896s of 12.029929161s, submitted: 137
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133226496 unmapped: 25124864 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133234688 unmapped: 25116672 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f89b4000/0x0/0x4ffc00000, data 0x27e06f0/0x28a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133234688 unmapped: 25116672 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133251072 unmapped: 25100288 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1477304 data_alloc: 234881024 data_used: 18571264
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133251072 unmapped: 25100288 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133251072 unmapped: 25100288 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133259264 unmapped: 25092096 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133259264 unmapped: 25092096 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f89b4000/0x0/0x4ffc00000, data 0x27e66f0/0x28a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133267456 unmapped: 25083904 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1475112 data_alloc: 234881024 data_used: 18575360
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c8461e00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1b400 session 0x55f8c80621e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8ee94a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 28917760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f89b2000/0x0/0x4ffc00000, data 0x27e76f0/0x28a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 28917760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 28917760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 28917760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 28917760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1322552 data_alloc: 234881024 data_used: 13017088
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.765671730s of 13.941443443s, submitted: 70
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 28893184 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f984e000/0x0/0x4ffc00000, data 0x194f65b/0x1a0e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 28893184 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c74712c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f800 session 0x55f8c82303c0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 28884992 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8daf4a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1140117 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1140117 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1140117 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124420096 unmapped: 33931264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124420096 unmapped: 33931264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124420096 unmapped: 33931264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124420096 unmapped: 33931264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1140117 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124420096 unmapped: 33931264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124428288 unmapped: 33923072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124428288 unmapped: 33923072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124428288 unmapped: 33923072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124428288 unmapped: 33923072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1140117 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124428288 unmapped: 33923072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124428288 unmapped: 33923072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124436480 unmapped: 33914880 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8ef8960
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8ef81e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1b400 session 0x55f8c8ef9860
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f800 session 0x55f8c8ef8b40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124436480 unmapped: 33914880 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.756196976s of 28.830261230s, submitted: 31
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c8251c20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c89145a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8063e00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c619cb40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1b400 session 0x55f8c8063a40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 33873920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1231334 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d7e000/0x0/0x4ffc00000, data 0x141e6bd/0x14de000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 33873920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 33873920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 33873920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d7e000/0x0/0x4ffc00000, data 0x141e6bd/0x14de000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 33873920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f800 session 0x55f8c7de3860
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 33873920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1232858 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 33865728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d5a000/0x0/0x4ffc00000, data 0x14426bd/0x1502000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 33865728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126795776 unmapped: 31555584 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126795776 unmapped: 31555584 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126795776 unmapped: 31555584 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1309770 data_alloc: 234881024 data_used: 16269312
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126795776 unmapped: 31555584 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d5a000/0x0/0x4ffc00000, data 0x14426bd/0x1502000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 31522816 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 31522816 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d5a000/0x0/0x4ffc00000, data 0x14426bd/0x1502000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 31522816 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 31522816 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1309770 data_alloc: 234881024 data_used: 16269312
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 31522816 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d5a000/0x0/0x4ffc00000, data 0x14426bd/0x1502000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 31522816 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.391180038s of 18.505029678s, submitted: 40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135880704 unmapped: 22470656 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8089000/0x0/0x4ffc00000, data 0x1f6b6bd/0x202b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 23281664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 23281664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417360 data_alloc: 234881024 data_used: 18153472
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 23281664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 23281664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 23281664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8007000/0x0/0x4ffc00000, data 0x1ff56bd/0x20b5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 23281664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7fe6000/0x0/0x4ffc00000, data 0x20166bd/0x20d6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133750784 unmapped: 24600576 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417224 data_alloc: 234881024 data_used: 18153472
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133750784 unmapped: 24600576 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7fe6000/0x0/0x4ffc00000, data 0x20166bd/0x20d6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133750784 unmapped: 24600576 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133750784 unmapped: 24600576 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7fe6000/0x0/0x4ffc00000, data 0x20166bd/0x20d6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133783552 unmapped: 24567808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133783552 unmapped: 24567808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417528 data_alloc: 234881024 data_used: 18161664
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133783552 unmapped: 24567808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.803490639s of 14.055315018s, submitted: 131
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7fe6000/0x0/0x4ffc00000, data 0x20166bd/0x20d6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133816320 unmapped: 24535040 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133816320 unmapped: 24535040 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f800 session 0x55f8c8250780
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c82501e0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7fe0000/0x0/0x4ffc00000, data 0x201c6bd/0x20dc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8daef00
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8efa5a0
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127066112 unmapped: 31285248 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127066112 unmapped: 31285248 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127066112 unmapped: 31285248 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127066112 unmapped: 31285248 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127066112 unmapped: 31285248 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127074304 unmapped: 31277056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127074304 unmapped: 31277056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127074304 unmapped: 31277056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127074304 unmapped: 31277056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7e7a400 session 0x55f8c6ca1a40
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127074304 unmapped: 31277056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127074304 unmapped: 31277056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126599168 unmapped: 31752192 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126599168 unmapped: 31752192 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126607360 unmapped: 31744000 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126607360 unmapped: 31744000 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126607360 unmapped: 31744000 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126607360 unmapped: 31744000 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126607360 unmapped: 31744000 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126623744 unmapped: 31727616 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126623744 unmapped: 31727616 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126623744 unmapped: 31727616 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126623744 unmapped: 31727616 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126623744 unmapped: 31727616 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff0c00 session 0x55f8c8ef9c20
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126664704 unmapped: 31686656 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126664704 unmapped: 31686656 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: do_command 'config diff' '{prefix=config diff}'
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: do_command 'config show' '{prefix=config show}'
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: do_command 'counter dump' '{prefix=counter dump}'
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: do_command 'counter schema' '{prefix=counter schema}'
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126738432 unmapped: 31612928 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:17:30 np0005548918 ceph-osd[78376]: do_command 'log dump' '{prefix=log dump}'
Dec  6 05:17:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  6 05:17:30 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3684677759' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  6 05:17:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:30 np0005548918 nova_compute[229246]: 2025-12-06 10:17:30.389 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  6 05:17:30 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/96602646' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  6 05:17:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  6 05:17:30 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1967292708' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  6 05:17:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:31 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  6 05:17:31 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2642598236' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  6 05:17:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:31.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:31 np0005548918 nova_compute[229246]: 2025-12-06 10:17:31.236 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:31.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:31 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec  6 05:17:31 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1553814133' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec  6 05:17:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:32 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Dec  6 05:17:32 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4055719220' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec  6 05:17:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:32 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec  6 05:17:32 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1749799299' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec  6 05:17:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:33.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec  6 05:17:33 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4191974410' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec  6 05:17:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec  6 05:17:33 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2079500196' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec  6 05:17:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:17:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:33.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:17:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec  6 05:17:33 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1400424366' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec  6 05:17:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/916825396' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3574515276' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3824839772' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2022830260' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2982586928' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec  6 05:17:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4038371396' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec  6 05:17:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec  6 05:17:34 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/508111638' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec  6 05:17:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec  6 05:17:35 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2990078285' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec  6 05:17:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:35.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:35 np0005548918 systemd[1]: Starting Hostname Service...
Dec  6 05:17:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec  6 05:17:35 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/487764021' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec  6 05:17:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:35.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:35 np0005548918 systemd[1]: Started Hostname Service.
Dec  6 05:17:35 np0005548918 nova_compute[229246]: 2025-12-06 10:17:35.389 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:35 np0005548918 podman[245059]: 2025-12-06 10:17:35.442359029 +0000 UTC m=+0.095335861 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  6 05:17:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec  6 05:17:35 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4113509356' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec  6 05:17:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec  6 05:17:35 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/394227616' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec  6 05:17:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:36 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec  6 05:17:36 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2789134809' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec  6 05:17:36 np0005548918 nova_compute[229246]: 2025-12-06 10:17:36.238 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:17:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:37.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:17:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:37.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:37 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec  6 05:17:37 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2804408455' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec  6 05:17:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:37 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Dec  6 05:17:37 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2453137415' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec  6 05:17:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec  6 05:17:38 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2392546310' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  6 05:17:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec  6 05:17:38 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3607230830' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec  6 05:17:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:39 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  6 05:17:39 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  6 05:17:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:39.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:39 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec  6 05:17:39 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec  6 05:17:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:17:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:39.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:17:39 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  6 05:17:39 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  6 05:17:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:39 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec  6 05:17:39 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec  6 05:17:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:39 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Dec  6 05:17:39 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1975698630' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec  6 05:17:40 np0005548918 nova_compute[229246]: 2025-12-06 10:17:40.390 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2870637964' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec  6 05:17:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:40.847109) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260847132, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2599, "num_deletes": 251, "total_data_size": 6459693, "memory_usage": 6544096, "flush_reason": "Manual Compaction"}
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260874359, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4202135, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31303, "largest_seqno": 33897, "table_properties": {"data_size": 4191058, "index_size": 6931, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 26531, "raw_average_key_size": 21, "raw_value_size": 4167766, "raw_average_value_size": 3421, "num_data_blocks": 296, "num_entries": 1218, "num_filter_entries": 1218, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016059, "oldest_key_time": 1765016059, "file_creation_time": 1765016260, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 27433 microseconds, and 7307 cpu microseconds.
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:40.874541) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4202135 bytes OK
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:40.874627) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:40.877247) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:40.877259) EVENT_LOG_v1 {"time_micros": 1765016260877255, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:40.877273) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6447513, prev total WAL file size 6447513, number of live WAL files 2.
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:40.879004) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(4103KB)], [60(12MB)]
Dec  6 05:17:40 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260879061, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 17353326, "oldest_snapshot_seqno": -1}
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6529 keys, 15136338 bytes, temperature: kUnknown
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016261016508, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 15136338, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15092828, "index_size": 26054, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 167520, "raw_average_key_size": 25, "raw_value_size": 14975390, "raw_average_value_size": 2293, "num_data_blocks": 1047, "num_entries": 6529, "num_filter_entries": 6529, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765016260, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:41.016887) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 15136338 bytes
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:41.019039) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 126.0 rd, 109.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.5 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(7.7) write-amplify(3.6) OK, records in: 7050, records dropped: 521 output_compression: NoCompression
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:41.019058) EVENT_LOG_v1 {"time_micros": 1765016261019049, "job": 36, "event": "compaction_finished", "compaction_time_micros": 137671, "compaction_time_cpu_micros": 27006, "output_level": 6, "num_output_files": 1, "total_output_size": 15136338, "num_input_records": 7050, "num_output_records": 6529, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016261020090, "job": 36, "event": "table_file_deletion", "file_number": 62}
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016261022502, "job": 36, "event": "table_file_deletion", "file_number": 60}
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:40.878912) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:41.022621) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:41.022625) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:41.022627) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:41.022630) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:17:41.022631) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:17:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:41.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2109662307' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec  6 05:17:41 np0005548918 nova_compute[229246]: 2025-12-06 10:17:41.239 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:17:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:41.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Dec  6 05:17:41 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3040169991' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec  6 05:17:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:42 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Dec  6 05:17:42 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2211086827' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec  6 05:17:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:42 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Dec  6 05:17:42 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1694569205' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec  6 05:17:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:43.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:43.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:43 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Dec  6 05:17:43 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1443159653' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec  6 05:17:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Dec  6 05:17:44 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1515055382' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec  6 05:17:44 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Dec  6 05:17:44 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1000968456' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec  6 05:17:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:44 np0005548918 ovs-appctl[246858]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  6 05:17:44 np0005548918 ovs-appctl[246863]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  6 05:17:45 np0005548918 ovs-appctl[246868]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  6 05:17:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:45.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Dec  6 05:17:45 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2038668301' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec  6 05:17:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:45.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:45 np0005548918 nova_compute[229246]: 2025-12-06 10:17:45.391 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:45 np0005548918 nova_compute[229246]: 2025-12-06 10:17:45.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:46 np0005548918 nova_compute[229246]: 2025-12-06 10:17:46.958 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:47 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  6 05:17:47 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2700179710' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  6 05:17:47 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  6 05:17:47 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2700179710' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  6 05:17:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:47.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:47.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:47 np0005548918 nova_compute[229246]: 2025-12-06 10:17:47.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:47 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Dec  6 05:17:47 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2607835880' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec  6 05:17:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Dec  6 05:17:48 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/950858382' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Dec  6 05:17:48 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Dec  6 05:17:48 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/820439724' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec  6 05:17:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:49.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:49.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:49 np0005548918 nova_compute[229246]: 2025-12-06 10:17:49.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:49 np0005548918 nova_compute[229246]: 2025-12-06 10:17:49.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:49 np0005548918 nova_compute[229246]: 2025-12-06 10:17:49.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:17:49 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec  6 05:17:49 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1985334892' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  6 05:17:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Dec  6 05:17:50 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1751496017' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec  6 05:17:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:50 np0005548918 nova_compute[229246]: 2025-12-06 10:17:50.393 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Dec  6 05:17:50 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2380510369' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec  6 05:17:50 np0005548918 nova_compute[229246]: 2025-12-06 10:17:50.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:17:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:51.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:17:51 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Dec  6 05:17:51 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2132475846' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  6 05:17:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:51.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:51 np0005548918 nova_compute[229246]: 2025-12-06 10:17:51.962 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:52 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Dec  6 05:17:52 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2505701622' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec  6 05:17:52 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Dec  6 05:17:52 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2093660645' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec  6 05:17:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:17:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:53.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:17:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:53.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Dec  6 05:17:53 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2941709570' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec  6 05:17:53 np0005548918 nova_compute[229246]: 2025-12-06 10:17:53.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:17:53.686 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:17:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:17:53.686 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:17:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:17:53.686 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:17:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:53 np0005548918 podman[248746]: 2025-12-06 10:17:53.764785747 +0000 UTC m=+0.144932093 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  6 05:17:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:53 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Dec  6 05:17:53 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1855919859' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec  6 05:17:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:54 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Dec  6 05:17:54 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1663530059' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec  6 05:17:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:55.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:17:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:55.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:55 np0005548918 nova_compute[229246]: 2025-12-06 10:17:55.395 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:55 np0005548918 nova_compute[229246]: 2025-12-06 10:17:55.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:55 np0005548918 nova_compute[229246]: 2025-12-06 10:17:55.564 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:17:55 np0005548918 nova_compute[229246]: 2025-12-06 10:17:55.565 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:17:55 np0005548918 nova_compute[229246]: 2025-12-06 10:17:55.565 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:17:55 np0005548918 nova_compute[229246]: 2025-12-06 10:17:55.565 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:17:55 np0005548918 nova_compute[229246]: 2025-12-06 10:17:55.565 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:17:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:17:55 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/34457584' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:17:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Dec  6 05:17:55 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4020160728' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.009 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:17:56 np0005548918 virtqemud[228866]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.180 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.181 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4626MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.181 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.181 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.257 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.257 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.312 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:17:56 np0005548918 systemd[1]: Starting Time & Date Service...
Dec  6 05:17:56 np0005548918 systemd[1]: Started Time & Date Service.
Dec  6 05:17:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:56 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:17:56 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3394423495' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.752 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.759 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.780 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.782 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.782 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:17:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:56 np0005548918 nova_compute[229246]: 2025-12-06 10:17:56.964 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:17:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:17:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:57.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:17:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:57.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Dec  6 05:17:57 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3263841297' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  6 05:17:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Dec  6 05:17:57 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2354413920' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec  6 05:17:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Dec  6 05:17:57 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3483261157' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec  6 05:17:58 np0005548918 podman[249555]: 2025-12-06 10:17:58.215207416 +0000 UTC m=+0.084548928 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec  6 05:17:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:58 np0005548918 nova_compute[229246]: 2025-12-06 10:17:58.778 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:58 np0005548918 nova_compute[229246]: 2025-12-06 10:17:58.778 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:17:58 np0005548918 nova_compute[229246]: 2025-12-06 10:17:58.779 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:17:58 np0005548918 nova_compute[229246]: 2025-12-06 10:17:58.779 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:17:58 np0005548918 nova_compute[229246]: 2025-12-06 10:17:58.810 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:17:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:59 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Dec  6 05:17:59 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2060160573' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec  6 05:17:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:59.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:17:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:17:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:59.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:17:59 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Dec  6 05:17:59 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2859090236' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec  6 05:17:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:17:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:17:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:17:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:00 np0005548918 nova_compute[229246]: 2025-12-06 10:18:00.398 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:01.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:01.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:01 np0005548918 nova_compute[229246]: 2025-12-06 10:18:01.967 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:03.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:03.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:05.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:05.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:05 np0005548918 nova_compute[229246]: 2025-12-06 10:18:05.399 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:06 np0005548918 podman[249660]: 2025-12-06 10:18:06.170442564 +0000 UTC m=+0.058614708 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  6 05:18:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:06 np0005548918 nova_compute[229246]: 2025-12-06 10:18:06.969 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:07.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:07.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:09.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:09.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:10 np0005548918 nova_compute[229246]: 2025-12-06 10:18:10.400 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:11.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:11.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:11 np0005548918 nova_compute[229246]: 2025-12-06 10:18:11.973 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:13.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:13.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:15.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:15.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:15 np0005548918 nova_compute[229246]: 2025-12-06 10:18:15.402 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:16 np0005548918 nova_compute[229246]: 2025-12-06 10:18:16.976 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:17.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:17.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:19 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:18:19 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:18:19 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:18:19 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:18:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:19.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:19.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:20 np0005548918 nova_compute[229246]: 2025-12-06 10:18:20.405 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:21.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:21.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:21 np0005548918 nova_compute[229246]: 2025-12-06 10:18:21.980 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:23.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:23.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:24 np0005548918 podman[249829]: 2025-12-06 10:18:24.19872083 +0000 UTC m=+0.082124602 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  6 05:18:24 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:18:24 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:18:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:25.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:25.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:25 np0005548918 nova_compute[229246]: 2025-12-06 10:18:25.405 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:26 np0005548918 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  6 05:18:26 np0005548918 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  6 05:18:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:26 np0005548918 nova_compute[229246]: 2025-12-06 10:18:26.983 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:27.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:27.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:29 np0005548918 podman[249865]: 2025-12-06 10:18:29.161602156 +0000 UTC m=+0.049824987 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  6 05:18:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:29.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:29.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:30 np0005548918 nova_compute[229246]: 2025-12-06 10:18:30.408 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:31.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:31.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:31 np0005548918 nova_compute[229246]: 2025-12-06 10:18:31.986 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:33.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:33.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:35.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:35.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:35 np0005548918 nova_compute[229246]: 2025-12-06 10:18:35.411 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:36 np0005548918 nova_compute[229246]: 2025-12-06 10:18:36.988 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:37 np0005548918 podman[249917]: 2025-12-06 10:18:37.194468191 +0000 UTC m=+0.077574137 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  6 05:18:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:37.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:37.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:39.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:39.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:40 np0005548918 nova_compute[229246]: 2025-12-06 10:18:40.411 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:41.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:41.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:41 np0005548918 nova_compute[229246]: 2025-12-06 10:18:41.992 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:43.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:43.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:45.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:45.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:45 np0005548918 nova_compute[229246]: 2025-12-06 10:18:45.413 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:46 np0005548918 nova_compute[229246]: 2025-12-06 10:18:46.995 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:47.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:47.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:47 np0005548918 nova_compute[229246]: 2025-12-06 10:18:47.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:18:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:48 np0005548918 systemd[1]: session-55.scope: Deactivated successfully.
Dec  6 05:18:48 np0005548918 systemd[1]: session-55.scope: Consumed 2min 47.892s CPU time, 708.1M memory peak, read 256.0M from disk, written 201.1M to disk.
Dec  6 05:18:48 np0005548918 systemd-logind[800]: Session 55 logged out. Waiting for processes to exit.
Dec  6 05:18:48 np0005548918 systemd-logind[800]: Removed session 55.
Dec  6 05:18:48 np0005548918 systemd-logind[800]: New session 56 of user zuul.
Dec  6 05:18:48 np0005548918 systemd[1]: Started Session 56 of User zuul.
Dec  6 05:18:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:48 np0005548918 systemd[1]: session-56.scope: Deactivated successfully.
Dec  6 05:18:48 np0005548918 systemd-logind[800]: Session 56 logged out. Waiting for processes to exit.
Dec  6 05:18:48 np0005548918 systemd-logind[800]: Removed session 56.
Dec  6 05:18:48 np0005548918 systemd-logind[800]: New session 57 of user zuul.
Dec  6 05:18:48 np0005548918 systemd[1]: Started Session 57 of User zuul.
Dec  6 05:18:49 np0005548918 systemd[1]: session-57.scope: Deactivated successfully.
Dec  6 05:18:49 np0005548918 systemd-logind[800]: Session 57 logged out. Waiting for processes to exit.
Dec  6 05:18:49 np0005548918 systemd-logind[800]: Removed session 57.
Dec  6 05:18:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:49.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:49.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:49 np0005548918 nova_compute[229246]: 2025-12-06 10:18:49.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:18:49 np0005548918 nova_compute[229246]: 2025-12-06 10:18:49.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:18:49 np0005548918 nova_compute[229246]: 2025-12-06 10:18:49.537 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:18:49 np0005548918 nova_compute[229246]: 2025-12-06 10:18:49.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:18:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:50 np0005548918 nova_compute[229246]: 2025-12-06 10:18:50.416 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:51.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:51.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:52 np0005548918 nova_compute[229246]: 2025-12-06 10:18:51.999 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:52 np0005548918 nova_compute[229246]: 2025-12-06 10:18:52.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:18:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:53.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:53.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:53 np0005548918 nova_compute[229246]: 2025-12-06 10:18:53.531 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:18:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:18:53.687 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:18:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:18:53.688 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:18:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:18:53.688 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:18:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:54 np0005548918 nova_compute[229246]: 2025-12-06 10:18:54.534 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:18:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:55.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:55 np0005548918 podman[250015]: 2025-12-06 10:18:55.284994433 +0000 UTC m=+0.161299683 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:18:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:18:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:55.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:55 np0005548918 nova_compute[229246]: 2025-12-06 10:18:55.417 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:57 np0005548918 nova_compute[229246]: 2025-12-06 10:18:57.002 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:18:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:57.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:57.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:57 np0005548918 nova_compute[229246]: 2025-12-06 10:18:57.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:18:57 np0005548918 nova_compute[229246]: 2025-12-06 10:18:57.565 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:18:57 np0005548918 nova_compute[229246]: 2025-12-06 10:18:57.565 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:18:57 np0005548918 nova_compute[229246]: 2025-12-06 10:18:57.566 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:18:57 np0005548918 nova_compute[229246]: 2025-12-06 10:18:57.566 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:18:57 np0005548918 nova_compute[229246]: 2025-12-06 10:18:57.566 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:18:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:57 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:18:57 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3649759905' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:18:57 np0005548918 nova_compute[229246]: 2025-12-06 10:18:57.993 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:18:58 np0005548918 nova_compute[229246]: 2025-12-06 10:18:58.162 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:18:58 np0005548918 nova_compute[229246]: 2025-12-06 10:18:58.163 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4809MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:18:58 np0005548918 nova_compute[229246]: 2025-12-06 10:18:58.163 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:18:58 np0005548918 nova_compute[229246]: 2025-12-06 10:18:58.163 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:18:58 np0005548918 nova_compute[229246]: 2025-12-06 10:18:58.251 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:18:58 np0005548918 nova_compute[229246]: 2025-12-06 10:18:58.252 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:18:58 np0005548918 nova_compute[229246]: 2025-12-06 10:18:58.272 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:18:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:18:58 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3542070725' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:18:58 np0005548918 nova_compute[229246]: 2025-12-06 10:18:58.739 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:18:58 np0005548918 nova_compute[229246]: 2025-12-06 10:18:58.748 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:18:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:58 np0005548918 nova_compute[229246]: 2025-12-06 10:18:58.768 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:18:58 np0005548918 nova_compute[229246]: 2025-12-06 10:18:58.771 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:18:58 np0005548918 nova_compute[229246]: 2025-12-06 10:18:58.771 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:18:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:18:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:59.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:18:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:18:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:18:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:59.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:18:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:18:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:18:59 np0005548918 nova_compute[229246]: 2025-12-06 10:18:59.772 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:18:59 np0005548918 nova_compute[229246]: 2025-12-06 10:18:59.772 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:18:59 np0005548918 nova_compute[229246]: 2025-12-06 10:18:59.773 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:18:59 np0005548918 nova_compute[229246]: 2025-12-06 10:18:59.773 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:18:59 np0005548918 nova_compute[229246]: 2025-12-06 10:18:59.793 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:18:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:18:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:00 np0005548918 podman[250115]: 2025-12-06 10:19:00.20664394 +0000 UTC m=+0.093157465 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:19:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:00 np0005548918 nova_compute[229246]: 2025-12-06 10:19:00.420 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:01.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:01.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:02 np0005548918 nova_compute[229246]: 2025-12-06 10:19:02.005 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:03.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:03.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:05.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:05.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:05 np0005548918 nova_compute[229246]: 2025-12-06 10:19:05.421 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:07 np0005548918 nova_compute[229246]: 2025-12-06 10:19:07.008 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:07.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:07.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:08 np0005548918 podman[250143]: 2025-12-06 10:19:08.220129513 +0000 UTC m=+0.100958488 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  6 05:19:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:09.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:09.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:10 np0005548918 nova_compute[229246]: 2025-12-06 10:19:10.424 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:11.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:11.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:12 np0005548918 nova_compute[229246]: 2025-12-06 10:19:12.011 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:13.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:13.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:15.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:15.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:15 np0005548918 nova_compute[229246]: 2025-12-06 10:19:15.426 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:17 np0005548918 nova_compute[229246]: 2025-12-06 10:19:17.013 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:17.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:17.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:19.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:19.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:20 np0005548918 nova_compute[229246]: 2025-12-06 10:19:20.428 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:21.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:19:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:21.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:19:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:22 np0005548918 nova_compute[229246]: 2025-12-06 10:19:22.015 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:23.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:23.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:25 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:25 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:25 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:25 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:25.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:25 np0005548918 nova_compute[229246]: 2025-12-06 10:19:25.429 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:25.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:26 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:19:26 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:26 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:26 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:19:26 np0005548918 podman[250287]: 2025-12-06 10:19:26.199151919 +0000 UTC m=+0.088741043 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  6 05:19:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:27 np0005548918 nova_compute[229246]: 2025-12-06 10:19:27.018 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:27.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:27.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:29.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:29.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:30 np0005548918 nova_compute[229246]: 2025-12-06 10:19:30.430 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:30 np0005548918 podman[250341]: 2025-12-06 10:19:30.581053011 +0000 UTC m=+0.047552045 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  6 05:19:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:19:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:31.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:31.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:32 np0005548918 nova_compute[229246]: 2025-12-06 10:19:32.020 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:33.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:19:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:33.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:19:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:35.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:35 np0005548918 nova_compute[229246]: 2025-12-06 10:19:35.431 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:19:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:35.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:19:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:37 np0005548918 nova_compute[229246]: 2025-12-06 10:19:37.023 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  6 05:19:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:37.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  6 05:19:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:19:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:37.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:19:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:39 np0005548918 podman[250395]: 2025-12-06 10:19:39.182151662 +0000 UTC m=+0.069698701 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  6 05:19:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:39.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:39.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:40 np0005548918 nova_compute[229246]: 2025-12-06 10:19:40.434 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:41.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:19:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:41.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:19:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:42 np0005548918 nova_compute[229246]: 2025-12-06 10:19:42.027 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:43.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:43.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:45.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:45 np0005548918 nova_compute[229246]: 2025-12-06 10:19:45.436 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:45.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:47 np0005548918 nova_compute[229246]: 2025-12-06 10:19:47.031 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:47.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:19:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:47.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:19:47 np0005548918 nova_compute[229246]: 2025-12-06 10:19:47.534 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:49.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:49.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:50 np0005548918 nova_compute[229246]: 2025-12-06 10:19:50.440 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:50 np0005548918 nova_compute[229246]: 2025-12-06 10:19:50.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:51.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:19:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:51.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:19:51 np0005548918 nova_compute[229246]: 2025-12-06 10:19:51.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:51 np0005548918 nova_compute[229246]: 2025-12-06 10:19:51.537 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:51 np0005548918 nova_compute[229246]: 2025-12-06 10:19:51.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:19:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:52 np0005548918 nova_compute[229246]: 2025-12-06 10:19:52.033 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:19:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:53.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:19:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:53.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:53 np0005548918 nova_compute[229246]: 2025-12-06 10:19:53.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:19:53.688 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:19:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:19:53.689 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:19:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:19:53.689 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:19:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:19:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:55.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:55 np0005548918 nova_compute[229246]: 2025-12-06 10:19:55.441 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:55.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:56 np0005548918 podman[250458]: 2025-12-06 10:19:56.437380518 +0000 UTC m=+0.148454440 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Dec  6 05:19:56 np0005548918 nova_compute[229246]: 2025-12-06 10:19:56.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:57 np0005548918 nova_compute[229246]: 2025-12-06 10:19:57.036 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:19:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:19:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:57.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:19:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:57.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:57 np0005548918 nova_compute[229246]: 2025-12-06 10:19:57.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:19:57 np0005548918 nova_compute[229246]: 2025-12-06 10:19:57.566 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:19:57 np0005548918 nova_compute[229246]: 2025-12-06 10:19:57.566 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:19:57 np0005548918 nova_compute[229246]: 2025-12-06 10:19:57.566 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:19:57 np0005548918 nova_compute[229246]: 2025-12-06 10:19:57.567 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:19:57 np0005548918 nova_compute[229246]: 2025-12-06 10:19:57.567 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:19:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:58 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:19:58 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/502386387' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:19:58 np0005548918 nova_compute[229246]: 2025-12-06 10:19:58.083 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:19:58 np0005548918 nova_compute[229246]: 2025-12-06 10:19:58.265 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:19:58 np0005548918 nova_compute[229246]: 2025-12-06 10:19:58.266 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4834MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:19:58 np0005548918 nova_compute[229246]: 2025-12-06 10:19:58.267 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:19:58 np0005548918 nova_compute[229246]: 2025-12-06 10:19:58.267 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:19:58 np0005548918 nova_compute[229246]: 2025-12-06 10:19:58.543 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:19:58 np0005548918 nova_compute[229246]: 2025-12-06 10:19:58.544 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:19:58 np0005548918 nova_compute[229246]: 2025-12-06 10:19:58.572 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:19:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:59 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:19:59 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4279224093' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:19:59 np0005548918 nova_compute[229246]: 2025-12-06 10:19:59.023 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:19:59 np0005548918 nova_compute[229246]: 2025-12-06 10:19:59.031 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:19:59 np0005548918 nova_compute[229246]: 2025-12-06 10:19:59.053 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:19:59 np0005548918 nova_compute[229246]: 2025-12-06 10:19:59.057 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:19:59 np0005548918 nova_compute[229246]: 2025-12-06 10:19:59.057 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.790s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:19:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:19:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:59.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:19:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:19:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:19:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:59.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:19:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:19:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:19:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:19:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:00 np0005548918 nova_compute[229246]: 2025-12-06 10:20:00.058 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:00 np0005548918 nova_compute[229246]: 2025-12-06 10:20:00.059 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:20:00 np0005548918 nova_compute[229246]: 2025-12-06 10:20:00.059 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:20:00 np0005548918 nova_compute[229246]: 2025-12-06 10:20:00.090 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:20:00 np0005548918 ceph-mon[75798]: overall HEALTH_OK
Dec  6 05:20:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:00 np0005548918 nova_compute[229246]: 2025-12-06 10:20:00.443 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:01 np0005548918 podman[250534]: 2025-12-06 10:20:01.222097 +0000 UTC m=+0.090726798 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:20:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:01.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:01.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:01 np0005548918 nova_compute[229246]: 2025-12-06 10:20:01.563 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:02 np0005548918 nova_compute[229246]: 2025-12-06 10:20:02.038 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:03.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:03.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:05.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:05 np0005548918 nova_compute[229246]: 2025-12-06 10:20:05.444 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.002000054s ======
Dec  6 05:20:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:05.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Dec  6 05:20:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:07 np0005548918 nova_compute[229246]: 2025-12-06 10:20:07.040 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:07.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:07.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:09.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:09.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:10 np0005548918 podman[250564]: 2025-12-06 10:20:10.210172739 +0000 UTC m=+0.085111925 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  6 05:20:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:10 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:20:10 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.5 total, 600.0 interval#012Cumulative writes: 6730 writes, 35K keys, 6730 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s#012Cumulative WAL: 6730 writes, 6730 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1548 writes, 8119 keys, 1548 commit groups, 1.0 writes per commit group, ingest: 18.02 MB, 0.03 MB/s#012Interval WAL: 1548 writes, 1548 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     50.4      1.00              0.14        18    0.055       0      0       0.0       0.0#012  L6      1/0   14.44 MB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   4.5    108.9     94.1      2.42              0.54        17    0.143     94K   9319       0.0       0.0#012 Sum      1/0   14.44 MB   0.0      0.3     0.0      0.2       0.3      0.1       0.0   5.5     77.2     81.4      3.42              0.69        35    0.098     94K   9319       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.9    123.5    126.6      0.56              0.14         8    0.069     26K   2592       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   0.0    108.9     94.1      2.42              0.54        17    0.143     94K   9319       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     79.0      0.64              0.14        17    0.037       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.36              0.00         1    0.360       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.5 total, 600.0 interval#012Flush(GB): cumulative 0.049, interval 0.012#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.27 GB write, 0.12 MB/s write, 0.26 GB read, 0.11 MB/s read, 3.4 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.11 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55784c777350#2 capacity: 304.00 MB usage: 22.89 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000234 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1381,22.15 MB,7.28483%) FilterBlock(35,279.92 KB,0.0899214%) IndexBlock(35,484.39 KB,0.155605%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  6 05:20:10 np0005548918 nova_compute[229246]: 2025-12-06 10:20:10.445 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:11.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:11.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:12 np0005548918 nova_compute[229246]: 2025-12-06 10:20:12.043 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:13.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:13.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:15.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:15 np0005548918 nova_compute[229246]: 2025-12-06 10:20:15.450 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:15.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:17 np0005548918 nova_compute[229246]: 2025-12-06 10:20:17.046 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:17.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:17.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:19.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:19.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:20 np0005548918 nova_compute[229246]: 2025-12-06 10:20:20.492 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:21.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:21.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:22 np0005548918 nova_compute[229246]: 2025-12-06 10:20:22.052 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:23.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:23.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:25.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:25 np0005548918 nova_compute[229246]: 2025-12-06 10:20:25.495 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:25.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:27 np0005548918 nova_compute[229246]: 2025-12-06 10:20:27.056 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:27 np0005548918 podman[250626]: 2025-12-06 10:20:27.226718503 +0000 UTC m=+0.105411880 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2)
Dec  6 05:20:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:27.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:27.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:29.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:29.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:30 np0005548918 nova_compute[229246]: 2025-12-06 10:20:30.497 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:20:31 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:20:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:31.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  6 05:20:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:31.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  6 05:20:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:32 np0005548918 nova_compute[229246]: 2025-12-06 10:20:32.059 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:32 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:20:32 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:20:32 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:20:32 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:20:32 np0005548918 podman[250808]: 2025-12-06 10:20:32.183699857 +0000 UTC m=+0.054665950 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  6 05:20:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:33.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:33.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:35.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:35 np0005548918 nova_compute[229246]: 2025-12-06 10:20:35.499 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:35.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:37 np0005548918 nova_compute[229246]: 2025-12-06 10:20:37.062 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:37.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:37 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:20:37 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:20:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:37.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:39.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:39.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:40 np0005548918 nova_compute[229246]: 2025-12-06 10:20:40.502 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:41 np0005548918 podman[250886]: 2025-12-06 10:20:41.194059816 +0000 UTC m=+0.071446629 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:20:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:41.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:41.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:42 np0005548918 nova_compute[229246]: 2025-12-06 10:20:42.066 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:43.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:43.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:45.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:45 np0005548918 nova_compute[229246]: 2025-12-06 10:20:45.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:45 np0005548918 nova_compute[229246]: 2025-12-06 10:20:45.535 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  6 05:20:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:45.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:45 np0005548918 nova_compute[229246]: 2025-12-06 10:20:45.560 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:45 np0005548918 nova_compute[229246]: 2025-12-06 10:20:45.575 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  6 05:20:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:47 np0005548918 nova_compute[229246]: 2025-12-06 10:20:47.068 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:47.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:47.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:48 np0005548918 nova_compute[229246]: 2025-12-06 10:20:48.576 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:49.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:49.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:50 np0005548918 nova_compute[229246]: 2025-12-06 10:20:50.562 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:51.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:51 np0005548918 nova_compute[229246]: 2025-12-06 10:20:51.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:51.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:51 2025: (VI_0) received an invalid passwd!
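The two keepalived containers above (the `nfs-cephfs` and `rgw-default` VIPs) log `(VI_0) received an invalid passwd!` roughly once per second: a peer's VRRP advertisements carry an authentication password that does not match this node's `vrrp_instance` configuration, so keepalived drops them. A minimal sketch (the function name and sample input are illustrative, not part of this log's tooling) that tallies how often each unit emits the message:

```python
import re
from collections import Counter

# Matches syslog lines of the form seen above:
#   Dec  6 05:20:47 host <unit>[<pid>]: ... (VI_0) received an invalid passwd!
PATTERN = re.compile(
    r"\S+\s+\d+\s+[\d:]+\s+\S+\s+(?P<unit>\S+?)\[\d+\]:.*received an invalid passwd"
)

def count_invalid_passwd(lines):
    """Tally VRRP 'invalid passwd' events per logging unit (syslog tag)."""
    counts = Counter()
    for line in lines:
        m = PATTERN.search(line)
        if m:
            counts[m.group("unit")] += 1
    return counts
```

Run over the full journal export, two counters climbing in lockstep (as here) suggests a cluster-wide VRRP secret mismatch rather than a single misconfigured peer.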
Dec  6 05:20:52 np0005548918 nova_compute[229246]: 2025-12-06 10:20:52.070 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:53.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:53 np0005548918 nova_compute[229246]: 2025-12-06 10:20:53.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:53 np0005548918 nova_compute[229246]: 2025-12-06 10:20:53.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:53 np0005548918 nova_compute[229246]: 2025-12-06 10:20:53.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:20:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:53.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
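Each radosgw request above is reported three times: a `====== starting new request ... =====` marker, a `====== req done ... ======` summary, and a `beast:` access-log line carrying client IP, user, timestamp, request line, HTTP status, byte count, and latency. The anonymous `HEAD /` probes arriving from 192.168.122.100 and .102 every two seconds are load-balancer health checks. A sketch (hypothetical helper name) that extracts the fields from a `beast:` line:

```python
import re
from typing import Optional

# Matches e.g.:
#   beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:53.556 +0000]
#   "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
BEAST = re.compile(
    r'beast: \S+: (?P<ip>\S+) - (?P<user>\S+) \[(?P<ts>[^\]]+)\] '
    r'"(?P<request>[^"]+)" (?P<status>\d+) (?P<bytes>\d+) .* latency=(?P<latency>[\d.]+)s'
)

def parse_beast(line: str) -> Optional[dict]:
    """Extract client IP, request line, status, bytes, and latency from a beast log entry."""
    m = BEAST.search(line)
    if not m:
        return None
    rec = m.groupdict()
    rec["status"] = int(rec["status"])
    rec["bytes"] = int(rec["bytes"])
    rec["latency"] = float(rec["latency"])
    return rec
```

Filtering out the `HEAD /` health checks first makes it much easier to see real S3/Swift traffic in a log like this one.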
Dec  6 05:20:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:20:53.691 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:20:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:20:53.691 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:20:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:20:53.691 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:20:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:54 np0005548918 nova_compute[229246]: 2025-12-06 10:20:54.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:20:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:55.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:55 np0005548918 nova_compute[229246]: 2025-12-06 10:20:55.549 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:55.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:55 np0005548918 nova_compute[229246]: 2025-12-06 10:20:55.597 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:56 np0005548918 nova_compute[229246]: 2025-12-06 10:20:56.530 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:56 np0005548918 nova_compute[229246]: 2025-12-06 10:20:56.551 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:57 np0005548918 nova_compute[229246]: 2025-12-06 10:20:57.073 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:20:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:57.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:20:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:57.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:20:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:58 np0005548918 podman[250948]: 2025-12-06 10:20:58.224757387 +0000 UTC m=+0.111912719 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  6 05:20:58 np0005548918 nova_compute[229246]: 2025-12-06 10:20:58.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:20:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:58 np0005548918 nova_compute[229246]: 2025-12-06 10:20:58.852 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:20:58 np0005548918 nova_compute[229246]: 2025-12-06 10:20:58.853 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:20:58 np0005548918 nova_compute[229246]: 2025-12-06 10:20:58.853 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:20:58 np0005548918 nova_compute[229246]: 2025-12-06 10:20:58.854 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:20:58 np0005548918 nova_compute[229246]: 2025-12-06 10:20:58.854 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:20:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:59 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:20:59 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4133591868' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:20:59 np0005548918 nova_compute[229246]: 2025-12-06 10:20:59.312 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
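The `ceph df` round trip above is nova's resource audit querying pool usage: oslo.concurrency's processutils logs `Running cmd (subprocess)` when it forks and `CMD "..." returned: <rc> in <secs>s` when the child exits, so the two lines bracket a 0.458 s call that the ceph-mon audit channel also records. A sketch (hypothetical helper name) that harvests the completion lines into (command, return code, duration) tuples:

```python
import re

# Matches processutils completion lines, e.g.:
#   CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s
CMD_DONE = re.compile(r'CMD "(?P<cmd>[^"]+)" returned: (?P<rc>\d+) in (?P<secs>[\d.]+)s')

def subprocess_timings(lines):
    """Collect (command, return_code, seconds) for each completed processutils call."""
    out = []
    for line in lines:
        m = CMD_DONE.search(line)
        if m:
            out.append((m.group("cmd"), int(m.group("rc")), float(m.group("secs"))))
    return out
```

Sorting the tuples by duration is a quick way to spot slow external commands (here, sub-second `ceph df` calls are unremarkable) without tracing each request id by hand.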
Dec  6 05:20:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:20:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:59.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:20:59 np0005548918 nova_compute[229246]: 2025-12-06 10:20:59.499 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:20:59 np0005548918 nova_compute[229246]: 2025-12-06 10:20:59.500 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4818MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m

Dec  6 05:20:59 np0005548918 nova_compute[229246]: 2025-12-06 10:20:59.500 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:20:59 np0005548918 nova_compute[229246]: 2025-12-06 10:20:59.501 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:20:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:20:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  6 05:20:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:59.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  6 05:20:59 np0005548918 nova_compute[229246]: 2025-12-06 10:20:59.581 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:20:59 np0005548918 nova_compute[229246]: 2025-12-06 10:20:59.581 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:20:59 np0005548918 nova_compute[229246]: 2025-12-06 10:20:59.654 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:20:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:20:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:20:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:20:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:21:00 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2168834215' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:21:00 np0005548918 nova_compute[229246]: 2025-12-06 10:21:00.085 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:21:00 np0005548918 nova_compute[229246]: 2025-12-06 10:21:00.090 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:21:00 np0005548918 nova_compute[229246]: 2025-12-06 10:21:00.107 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:21:00 np0005548918 nova_compute[229246]: 2025-12-06 10:21:00.108 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:21:00 np0005548918 nova_compute[229246]: 2025-12-06 10:21:00.109 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
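The `compute_resources` lock above is held for 0.608 s across the whole `_update_available_resource` pass; oslo.concurrency's lockutils prints `waited` on acquisition and `held` on release, which makes lock contention easy to mine from the journal. A sketch (hypothetical helper name) collecting per-lock hold times from the release lines:

```python
import re

# Matches lockutils release lines, e.g.:
#   Lock "compute_resources" "released" by "...ResourceTracker._update_available_resource" :: held 0.608s
HELD = re.compile(r'Lock "(?P<name>[^"]+)" "released" .* held (?P<held>[\d.]+)s')

def held_times(lines):
    """Map lock name -> list of hold durations (seconds) from lockutils release lines."""
    out = {}
    for line in lines:
        m = HELD.search(line)
        if m:
            out.setdefault(m.group("name"), []).append(float(m.group("held")))
    return out
```

Long or growing hold times on `compute_resources` would delay other periodic tasks that need the same lock; the sub-second values in this log are normal for an idle hypervisor.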
Dec  6 05:21:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:00 np0005548918 nova_compute[229246]: 2025-12-06 10:21:00.600 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:01 np0005548918 nova_compute[229246]: 2025-12-06 10:21:01.110 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:21:01 np0005548918 nova_compute[229246]: 2025-12-06 10:21:01.110 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:21:01 np0005548918 nova_compute[229246]: 2025-12-06 10:21:01.111 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:21:01 np0005548918 nova_compute[229246]: 2025-12-06 10:21:01.129 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:21:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:01.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:01.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:02 np0005548918 nova_compute[229246]: 2025-12-06 10:21:02.077 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:02 np0005548918 nova_compute[229246]: 2025-12-06 10:21:02.550 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:21:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:21:02 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 10K writes, 42K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s#012Cumulative WAL: 10K writes, 2969 syncs, 3.59 writes per sync, written: 0.04 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2193 writes, 7965 keys, 2193 commit groups, 1.0 writes per commit group, ingest: 9.74 MB, 0.02 MB/s#012Interval WAL: 2193 writes, 896 syncs, 2.45 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  6 05:21:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:03 np0005548918 podman[251023]: 2025-12-06 10:21:03.160464507 +0000 UTC m=+0.049265030 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  6 05:21:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:03.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:03.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:05.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:05 np0005548918 nova_compute[229246]: 2025-12-06 10:21:05.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:21:05 np0005548918 nova_compute[229246]: 2025-12-06 10:21:05.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  6 05:21:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:05.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:05 np0005548918 nova_compute[229246]: 2025-12-06 10:21:05.601 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:07 np0005548918 nova_compute[229246]: 2025-12-06 10:21:07.079 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.138647) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467138700, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2725, "num_deletes": 508, "total_data_size": 6302188, "memory_usage": 6388640, "flush_reason": "Manual Compaction"}
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467176681, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 4104756, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33902, "largest_seqno": 36622, "table_properties": {"data_size": 4093704, "index_size": 6586, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 27198, "raw_average_key_size": 20, "raw_value_size": 4069181, "raw_average_value_size": 3023, "num_data_blocks": 283, "num_entries": 1346, "num_filter_entries": 1346, "num_deletions": 508, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016261, "oldest_key_time": 1765016261, "file_creation_time": 1765016467, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 38100 microseconds, and 16068 cpu microseconds.
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.176757) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 4104756 bytes OK
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.176781) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.178424) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.178442) EVENT_LOG_v1 {"time_micros": 1765016467178436, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.178462) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 6288869, prev total WAL file size 6288869, number of live WAL files 2.
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.180687) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(4008KB)], [63(14MB)]
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467180758, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 19241094, "oldest_snapshot_seqno": -1}
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6840 keys, 17003714 bytes, temperature: kUnknown
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467330000, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 17003714, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16956033, "index_size": 29457, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 176676, "raw_average_key_size": 25, "raw_value_size": 16831193, "raw_average_value_size": 2460, "num_data_blocks": 1183, "num_entries": 6840, "num_filter_entries": 6840, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765016467, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.330479) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 17003714 bytes
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.331961) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 128.8 rd, 113.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 14.4 +0.0 blob) out(16.2 +0.0 blob), read-write-amplify(8.8) write-amplify(4.1) OK, records in: 7875, records dropped: 1035 output_compression: NoCompression
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.332007) EVENT_LOG_v1 {"time_micros": 1765016467331980, "job": 38, "event": "compaction_finished", "compaction_time_micros": 149371, "compaction_time_cpu_micros": 42617, "output_level": 6, "num_output_files": 1, "total_output_size": 17003714, "num_input_records": 7875, "num_output_records": 6840, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467333540, "job": 38, "event": "table_file_deletion", "file_number": 65}
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467339060, "job": 38, "event": "table_file_deletion", "file_number": 63}
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.180564) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.339159) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.339168) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.339172) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.339176) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:07 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:07.339180) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:07.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:07.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:09.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:09.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:10 np0005548918 nova_compute[229246]: 2025-12-06 10:21:10.603 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:11.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:11.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:12 np0005548918 nova_compute[229246]: 2025-12-06 10:21:12.083 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:12 np0005548918 podman[251052]: 2025-12-06 10:21:12.189009766 +0000 UTC m=+0.075943592 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd)
Dec  6 05:21:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:13.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:13.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:15.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:15.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:15 np0005548918 nova_compute[229246]: 2025-12-06 10:21:15.605 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:17 np0005548918 nova_compute[229246]: 2025-12-06 10:21:17.086 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:17.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:17.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:19.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:19.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.037529) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480037648, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 363, "num_deletes": 251, "total_data_size": 331303, "memory_usage": 338344, "flush_reason": "Manual Compaction"}
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480043659, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 217819, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36627, "largest_seqno": 36985, "table_properties": {"data_size": 215646, "index_size": 337, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5932, "raw_average_key_size": 20, "raw_value_size": 211357, "raw_average_value_size": 721, "num_data_blocks": 15, "num_entries": 293, "num_filter_entries": 293, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016468, "oldest_key_time": 1765016468, "file_creation_time": 1765016480, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 6205 microseconds, and 2029 cpu microseconds.
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.043745) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 217819 bytes OK
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.043779) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.045815) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.045848) EVENT_LOG_v1 {"time_micros": 1765016480045837, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.045882) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 328871, prev total WAL file size 328871, number of live WAL files 2.
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.046685) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303033' seq:72057594037927935, type:22 .. '6D6772737461740031323535' seq:0, type:0; will stop at (end)
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(212KB)], [66(16MB)]
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480046738, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 17221533, "oldest_snapshot_seqno": -1}
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6623 keys, 13144993 bytes, temperature: kUnknown
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480164055, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 13144993, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13103579, "index_size": 23766, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16581, "raw_key_size": 172348, "raw_average_key_size": 26, "raw_value_size": 12987115, "raw_average_value_size": 1960, "num_data_blocks": 945, "num_entries": 6623, "num_filter_entries": 6623, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765016480, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.164504) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 13144993 bytes
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.165955) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 146.6 rd, 111.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 16.2 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(139.4) write-amplify(60.3) OK, records in: 7133, records dropped: 510 output_compression: NoCompression
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.165979) EVENT_LOG_v1 {"time_micros": 1765016480165967, "job": 40, "event": "compaction_finished", "compaction_time_micros": 117493, "compaction_time_cpu_micros": 49962, "output_level": 6, "num_output_files": 1, "total_output_size": 13144993, "num_input_records": 7133, "num_output_records": 6623, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480166234, "job": 40, "event": "table_file_deletion", "file_number": 68}
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480170357, "job": 40, "event": "table_file_deletion", "file_number": 66}
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.046566) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.170521) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.170530) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.170533) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.170536) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:21:20.170539) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:21:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:20 np0005548918 nova_compute[229246]: 2025-12-06 10:21:20.608 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:21.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:21.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:22 np0005548918 nova_compute[229246]: 2025-12-06 10:21:22.089 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:23.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:23.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:25.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:25.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:25 np0005548918 nova_compute[229246]: 2025-12-06 10:21:25.611 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:27 np0005548918 nova_compute[229246]: 2025-12-06 10:21:27.092 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:27.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:27.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:29 np0005548918 podman[251115]: 2025-12-06 10:21:29.222552297 +0000 UTC m=+0.099166930 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:21:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:29.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:29.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:30 np0005548918 nova_compute[229246]: 2025-12-06 10:21:30.613 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:31.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:31.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:31 np0005548918 nova_compute[229246]: 2025-12-06 10:21:31.689 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:21:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:32 np0005548918 nova_compute[229246]: 2025-12-06 10:21:32.095 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:33.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:33.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:34 np0005548918 podman[251147]: 2025-12-06 10:21:34.165964117 +0000 UTC m=+0.055431680 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Dec  6 05:21:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:35.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:35.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:35 np0005548918 nova_compute[229246]: 2025-12-06 10:21:35.614 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:37 np0005548918 nova_compute[229246]: 2025-12-06 10:21:37.098 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:37.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:37.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:38 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:21:38 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:21:38 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:21:38 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:21:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:39.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:39.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:40 np0005548918 nova_compute[229246]: 2025-12-06 10:21:40.615 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:41.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:41.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:42 np0005548918 nova_compute[229246]: 2025-12-06 10:21:42.101 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:42 np0005548918 podman[251305]: 2025-12-06 10:21:42.604299027 +0000 UTC m=+0.060222871 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:21:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:43 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:21:43 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:21:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:43.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:43.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:45.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:45 np0005548918 nova_compute[229246]: 2025-12-06 10:21:45.617 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:45.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  6 05:21:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2229648088' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  6 05:21:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  6 05:21:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2229648088' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  6 05:21:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:47 np0005548918 nova_compute[229246]: 2025-12-06 10:21:47.104 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:47.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:47.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:48 np0005548918 nova_compute[229246]: 2025-12-06 10:21:48.552 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:21:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:49.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:49.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:50 np0005548918 nova_compute[229246]: 2025-12-06 10:21:50.620 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  6 05:21:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:51.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  6 05:21:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:51.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:52 np0005548918 nova_compute[229246]: 2025-12-06 10:21:52.106 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:52 np0005548918 nova_compute[229246]: 2025-12-06 10:21:52.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:21:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:53.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:53.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:21:53.690 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:21:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:21:53.691 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:21:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:21:53.691 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:21:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:54 np0005548918 nova_compute[229246]: 2025-12-06 10:21:54.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:21:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:21:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:55.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:55 np0005548918 nova_compute[229246]: 2025-12-06 10:21:55.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:21:55 np0005548918 nova_compute[229246]: 2025-12-06 10:21:55.537 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:21:55 np0005548918 nova_compute[229246]: 2025-12-06 10:21:55.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:21:55 np0005548918 nova_compute[229246]: 2025-12-06 10:21:55.621 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:55.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:57 np0005548918 nova_compute[229246]: 2025-12-06 10:21:57.109 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:21:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:57.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:57.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:58 np0005548918 nova_compute[229246]: 2025-12-06 10:21:58.537 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:21:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:21:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:59.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:21:59 np0005548918 nova_compute[229246]: 2025-12-06 10:21:59.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:21:59 np0005548918 nova_compute[229246]: 2025-12-06 10:21:59.535 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:21:59 np0005548918 nova_compute[229246]: 2025-12-06 10:21:59.535 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:21:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:21:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:21:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:59.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:21:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:21:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:21:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:21:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:00 np0005548918 nova_compute[229246]: 2025-12-06 10:22:00.077 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:22:00 np0005548918 podman[251369]: 2025-12-06 10:22:00.236613751 +0000 UTC m=+0.126571850 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 05:22:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:00 np0005548918 nova_compute[229246]: 2025-12-06 10:22:00.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:22:00 np0005548918 nova_compute[229246]: 2025-12-06 10:22:00.623 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Dec  6 05:22:00 np0005548918 nova_compute[229246]: 2025-12-06 10:22:00.776 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:22:00 np0005548918 nova_compute[229246]: 2025-12-06 10:22:00.776 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:22:00 np0005548918 nova_compute[229246]: 2025-12-06 10:22:00.777 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:22:00 np0005548918 nova_compute[229246]: 2025-12-06 10:22:00.777 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:22:00 np0005548918 nova_compute[229246]: 2025-12-06 10:22:00.778 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:22:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Dec  6 05:22:00 np0005548918 radosgw[83463]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Dec  6 05:22:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:01 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:22:01 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3408085492' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:22:01 np0005548918 nova_compute[229246]: 2025-12-06 10:22:01.252 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:22:01 np0005548918 nova_compute[229246]: 2025-12-06 10:22:01.415 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:22:01 np0005548918 nova_compute[229246]: 2025-12-06 10:22:01.417 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4821MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:22:01 np0005548918 nova_compute[229246]: 2025-12-06 10:22:01.417 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:22:01 np0005548918 nova_compute[229246]: 2025-12-06 10:22:01.417 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:22:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:22:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:01.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:22:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:22:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:01.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:22:01 np0005548918 nova_compute[229246]: 2025-12-06 10:22:01.735 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:22:01 np0005548918 nova_compute[229246]: 2025-12-06 10:22:01.735 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:22:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:01 np0005548918 nova_compute[229246]: 2025-12-06 10:22:01.921 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing inventories for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  6 05:22:02 np0005548918 nova_compute[229246]: 2025-12-06 10:22:02.113 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:02 np0005548918 nova_compute[229246]: 2025-12-06 10:22:02.114 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating ProviderTree inventory for provider 31f5f484-bf36-44de-83b8-7b434061a77b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  6 05:22:02 np0005548918 nova_compute[229246]: 2025-12-06 10:22:02.115 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating inventory in ProviderTree for provider 31f5f484-bf36-44de-83b8-7b434061a77b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  6 05:22:02 np0005548918 nova_compute[229246]: 2025-12-06 10:22:02.162 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing aggregate associations for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  6 05:22:02 np0005548918 nova_compute[229246]: 2025-12-06 10:22:02.261 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing trait associations for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE4A,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_ABM,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,HW_CPU_X86_AVX2,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  6 05:22:02 np0005548918 nova_compute[229246]: 2025-12-06 10:22:02.323 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:22:02 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:22:02 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3417508859' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:22:02 np0005548918 nova_compute[229246]: 2025-12-06 10:22:02.772 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:22:02 np0005548918 nova_compute[229246]: 2025-12-06 10:22:02.779 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:22:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:02 np0005548918 nova_compute[229246]: 2025-12-06 10:22:02.877 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:22:02 np0005548918 nova_compute[229246]: 2025-12-06 10:22:02.879 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:22:02 np0005548918 nova_compute[229246]: 2025-12-06 10:22:02.879 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.462s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:22:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:22:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:03.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:22:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:22:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:03.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:22:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:04 np0005548918 nova_compute[229246]: 2025-12-06 10:22:04.875 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:22:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:05 np0005548918 podman[251445]: 2025-12-06 10:22:05.181607135 +0000 UTC m=+0.063347312 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  6 05:22:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:05.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:05 np0005548918 nova_compute[229246]: 2025-12-06 10:22:05.625 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:05.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:07 np0005548918 nova_compute[229246]: 2025-12-06 10:22:07.115 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:07.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:07.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:09.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:09.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:10 np0005548918 nova_compute[229246]: 2025-12-06 10:22:10.646 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:22:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:11.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:22:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:11.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:12 np0005548918 nova_compute[229246]: 2025-12-06 10:22:12.119 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:13 np0005548918 podman[251473]: 2025-12-06 10:22:13.189304911 +0000 UTC m=+0.066343022 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec  6 05:22:13 np0005548918 nova_compute[229246]: 2025-12-06 10:22:13.428 229250 DEBUG oslo_concurrency.processutils [None req-1b326720-1719-4a67-9e7f-ab0eb7cb97ad bcb29c3303b24519a22c267aaed79458 3e0ab101ca7547d4a515169a0f2edef3 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:22:13 np0005548918 nova_compute[229246]: 2025-12-06 10:22:13.459 229250 DEBUG oslo_concurrency.processutils [None req-1b326720-1719-4a67-9e7f-ab0eb7cb97ad bcb29c3303b24519a22c267aaed79458 3e0ab101ca7547d4a515169a0f2edef3 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:22:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:22:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:13.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:22:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:13.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:15.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:15 np0005548918 nova_compute[229246]: 2025-12-06 10:22:15.647 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:15.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:17 np0005548918 nova_compute[229246]: 2025-12-06 10:22:17.121 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:17.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:17.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:19.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:19.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:19 np0005548918 nova_compute[229246]: 2025-12-06 10:22:19.808 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:19 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:22:19.808 141640 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  6 05:22:19 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:22:19.809 141640 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  6 05:22:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:20 np0005548918 nova_compute[229246]: 2025-12-06 10:22:20.648 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:21.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:21.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:22 np0005548918 nova_compute[229246]: 2025-12-06 10:22:22.163 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:22 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:22:22.810 141640 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=1b31b208-e0d4-490d-9f30-552f5575d012, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  6 05:22:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:23.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:23.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:25.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:25 np0005548918 nova_compute[229246]: 2025-12-06 10:22:25.650 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:25.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:27 np0005548918 nova_compute[229246]: 2025-12-06 10:22:27.166 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:22:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:27.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:22:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:27.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:22:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:29.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:22:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:22:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:29.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:22:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:30 np0005548918 nova_compute[229246]: 2025-12-06 10:22:30.658 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:31 np0005548918 podman[251538]: 2025-12-06 10:22:31.235975732 +0000 UTC m=+0.117068483 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  6 05:22:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:31.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:31.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:32 np0005548918 nova_compute[229246]: 2025-12-06 10:22:32.168 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:33.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:22:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:33.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:22:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:35.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:35 np0005548918 nova_compute[229246]: 2025-12-06 10:22:35.660 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:35.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:36 np0005548918 podman[251569]: 2025-12-06 10:22:36.202491978 +0000 UTC m=+0.083543238 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec  6 05:22:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:37 np0005548918 nova_compute[229246]: 2025-12-06 10:22:37.171 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:37.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:37.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:22:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:39.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:22:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:39.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:40 np0005548918 nova_compute[229246]: 2025-12-06 10:22:40.663 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:22:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:41.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:22:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:41.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:42 np0005548918 nova_compute[229246]: 2025-12-06 10:22:42.174 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:43 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:22:43 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:22:43 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:22:43 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:22:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:43.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:43.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:44 np0005548918 podman[251703]: 2025-12-06 10:22:44.204520702 +0000 UTC m=+0.088261995 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec  6 05:22:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:22:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:45.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:22:45 np0005548918 nova_compute[229246]: 2025-12-06 10:22:45.665 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:45.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:47 np0005548918 nova_compute[229246]: 2025-12-06 10:22:47.177 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:47.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:47.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:48 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:22:48 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:22:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:22:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:49.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:22:49 np0005548918 nova_compute[229246]: 2025-12-06 10:22:49.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:22:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:49.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:50 np0005548918 nova_compute[229246]: 2025-12-06 10:22:50.667 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:51.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:51.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:52 np0005548918 nova_compute[229246]: 2025-12-06 10:22:52.180 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:22:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:53.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:22:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:22:53.692 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:22:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:22:53.692 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:22:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:22:53.692 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:22:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:53.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:54 np0005548918 nova_compute[229246]: 2025-12-06 10:22:54.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:22:54 np0005548918 nova_compute[229246]: 2025-12-06 10:22:54.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:22:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:22:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:55.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:55 np0005548918 nova_compute[229246]: 2025-12-06 10:22:55.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:22:55 np0005548918 nova_compute[229246]: 2025-12-06 10:22:55.535 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:22:55 np0005548918 nova_compute[229246]: 2025-12-06 10:22:55.670 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:55.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:57 np0005548918 nova_compute[229246]: 2025-12-06 10:22:57.184 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:22:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:57.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:57 np0005548918 nova_compute[229246]: 2025-12-06 10:22:57.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:22:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:57.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:22:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:59.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:22:59 np0005548918 nova_compute[229246]: 2025-12-06 10:22:59.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:22:59 np0005548918 nova_compute[229246]: 2025-12-06 10:22:59.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:22:59 np0005548918 nova_compute[229246]: 2025-12-06 10:22:59.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:22:59 np0005548918 nova_compute[229246]: 2025-12-06 10:22:59.556 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:22:59 np0005548918 nova_compute[229246]: 2025-12-06 10:22:59.557 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:22:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:22:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:22:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:59.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:22:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:22:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:22:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:22:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:00 np0005548918 nova_compute[229246]: 2025-12-06 10:23:00.671 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:01.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:01 np0005548918 nova_compute[229246]: 2025-12-06 10:23:01.551 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:23:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:01.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:02 np0005548918 nova_compute[229246]: 2025-12-06 10:23:02.188 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:02 np0005548918 podman[251792]: 2025-12-06 10:23:02.212928329 +0000 UTC m=+0.100106565 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:23:02 np0005548918 nova_compute[229246]: 2025-12-06 10:23:02.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:23:02 np0005548918 nova_compute[229246]: 2025-12-06 10:23:02.537 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:23:02 np0005548918 nova_compute[229246]: 2025-12-06 10:23:02.567 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:23:02 np0005548918 nova_compute[229246]: 2025-12-06 10:23:02.568 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:23:02 np0005548918 nova_compute[229246]: 2025-12-06 10:23:02.568 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:23:02 np0005548918 nova_compute[229246]: 2025-12-06 10:23:02.569 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:23:02 np0005548918 nova_compute[229246]: 2025-12-06 10:23:02.569 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:23:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:03 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:23:03 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2126492916' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:23:03 np0005548918 nova_compute[229246]: 2025-12-06 10:23:03.040 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:23:03 np0005548918 nova_compute[229246]: 2025-12-06 10:23:03.226 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:23:03 np0005548918 nova_compute[229246]: 2025-12-06 10:23:03.227 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4851MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:23:03 np0005548918 nova_compute[229246]: 2025-12-06 10:23:03.227 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:23:03 np0005548918 nova_compute[229246]: 2025-12-06 10:23:03.227 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:23:03 np0005548918 nova_compute[229246]: 2025-12-06 10:23:03.316 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:23:03 np0005548918 nova_compute[229246]: 2025-12-06 10:23:03.317 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:23:03 np0005548918 nova_compute[229246]: 2025-12-06 10:23:03.344 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:23:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.002000053s ======
Dec  6 05:23:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:03.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Dec  6 05:23:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:03.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:03 np0005548918 nova_compute[229246]: 2025-12-06 10:23:03.837 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:23:03 np0005548918 nova_compute[229246]: 2025-12-06 10:23:03.843 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:23:03 np0005548918 nova_compute[229246]: 2025-12-06 10:23:03.877 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:23:03 np0005548918 nova_compute[229246]: 2025-12-06 10:23:03.879 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:23:03 np0005548918 nova_compute[229246]: 2025-12-06 10:23:03.879 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:23:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:05.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:05 np0005548918 nova_compute[229246]: 2025-12-06 10:23:05.674 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:05.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:07 np0005548918 podman[251868]: 2025-12-06 10:23:07.183584426 +0000 UTC m=+0.066496655 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  6 05:23:07 np0005548918 nova_compute[229246]: 2025-12-06 10:23:07.190 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:23:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:07.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:23:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:07.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:09.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:09.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:10 np0005548918 nova_compute[229246]: 2025-12-06 10:23:10.677 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:23:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:11.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:23:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:23:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:11.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:23:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:12 np0005548918 nova_compute[229246]: 2025-12-06 10:23:12.219 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:23:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:13.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:23:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:13.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:15 np0005548918 podman[251896]: 2025-12-06 10:23:15.205879526 +0000 UTC m=+0.084036500 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec  6 05:23:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:15.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:15 np0005548918 nova_compute[229246]: 2025-12-06 10:23:15.680 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:15.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:17 np0005548918 nova_compute[229246]: 2025-12-06 10:23:17.256 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:17.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:17.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:23:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:19.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:23:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:19.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:20 np0005548918 nova_compute[229246]: 2025-12-06 10:23:20.682 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:21.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:21.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:22 np0005548918 nova_compute[229246]: 2025-12-06 10:23:22.322 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:23.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:23.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:23:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:25.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:23:25 np0005548918 nova_compute[229246]: 2025-12-06 10:23:25.684 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:25.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:27 np0005548918 nova_compute[229246]: 2025-12-06 10:23:27.325 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:23:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:27.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:23:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:27.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:29.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:29.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:30 np0005548918 nova_compute[229246]: 2025-12-06 10:23:30.687 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:31.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:31.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:32 np0005548918 nova_compute[229246]: 2025-12-06 10:23:32.328 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:33 np0005548918 podman[251959]: 2025-12-06 10:23:33.195151239 +0000 UTC m=+0.079566821 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:23:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:33.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:33.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:23:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:35.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:23:35 np0005548918 nova_compute[229246]: 2025-12-06 10:23:35.689 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:35.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:37 np0005548918 podman[252014]: 2025-12-06 10:23:37.301967069 +0000 UTC m=+0.040430134 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  6 05:23:37 np0005548918 nova_compute[229246]: 2025-12-06 10:23:37.383 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:37.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:37.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:23:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:39.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:23:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:39.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:40 np0005548918 nova_compute[229246]: 2025-12-06 10:23:40.692 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:23:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:41.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:23:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:41.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:42 np0005548918 nova_compute[229246]: 2025-12-06 10:23:42.418 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:43.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:43.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:45.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:45 np0005548918 nova_compute[229246]: 2025-12-06 10:23:45.693 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:45.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:46 np0005548918 podman[252042]: 2025-12-06 10:23:46.168952122 +0000 UTC m=+0.059731564 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  6 05:23:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:47 np0005548918 nova_compute[229246]: 2025-12-06 10:23:47.459 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:23:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:47.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:23:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:47.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:23:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:49.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:23:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:49.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:49 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:23:49 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:23:49 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:23:49 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.090933) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630090966, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 1706, "num_deletes": 250, "total_data_size": 4188119, "memory_usage": 4252560, "flush_reason": "Manual Compaction"}
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630112986, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 2747624, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36990, "largest_seqno": 38691, "table_properties": {"data_size": 2740571, "index_size": 4060, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 13926, "raw_average_key_size": 18, "raw_value_size": 2726508, "raw_average_value_size": 3669, "num_data_blocks": 178, "num_entries": 743, "num_filter_entries": 743, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016481, "oldest_key_time": 1765016481, "file_creation_time": 1765016630, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 22097 microseconds, and 10366 cpu microseconds.
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.113029) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 2747624 bytes OK
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.113048) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.114543) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.114554) EVENT_LOG_v1 {"time_micros": 1765016630114550, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.114569) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 4180307, prev total WAL file size 4180307, number of live WAL files 2.
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.115659) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323533' seq:72057594037927935, type:22 .. '6B7600353034' seq:0, type:0; will stop at (end)
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(2683KB)], [69(12MB)]
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630115740, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 15892617, "oldest_snapshot_seqno": -1}
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6852 keys, 14489174 bytes, temperature: kUnknown
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630236295, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 14489174, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14444865, "index_size": 26085, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 178801, "raw_average_key_size": 26, "raw_value_size": 14322959, "raw_average_value_size": 2090, "num_data_blocks": 1032, "num_entries": 6852, "num_filter_entries": 6852, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765016630, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.236675) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 14489174 bytes
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.237930) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 131.9 rd, 120.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 12.5 +0.0 blob) out(13.8 +0.0 blob), read-write-amplify(11.1) write-amplify(5.3) OK, records in: 7366, records dropped: 514 output_compression: NoCompression
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.237951) EVENT_LOG_v1 {"time_micros": 1765016630237941, "job": 42, "event": "compaction_finished", "compaction_time_micros": 120469, "compaction_time_cpu_micros": 53834, "output_level": 6, "num_output_files": 1, "total_output_size": 14489174, "num_input_records": 7366, "num_output_records": 6852, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630238697, "job": 42, "event": "table_file_deletion", "file_number": 71}
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630241787, "job": 42, "event": "table_file_deletion", "file_number": 69}
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.115415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.241873) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.241880) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.241883) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.241886) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:23:50.241888) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:23:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:50 np0005548918 nova_compute[229246]: 2025-12-06 10:23:50.694 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:23:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:51.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:23:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:23:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:51.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:23:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:52 np0005548918 nova_compute[229246]: 2025-12-06 10:23:52.465 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:52 np0005548918 nova_compute[229246]: 2025-12-06 10:23:52.878 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:23:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:53.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:23:53.694 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:23:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:23:53.695 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:23:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:23:53.695 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:23:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:53.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:55 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:23:55 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:23:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:23:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:23:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:55.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:23:55 np0005548918 nova_compute[229246]: 2025-12-06 10:23:55.696 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:55.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:56 np0005548918 nova_compute[229246]: 2025-12-06 10:23:56.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:23:56 np0005548918 nova_compute[229246]: 2025-12-06 10:23:56.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:23:56 np0005548918 nova_compute[229246]: 2025-12-06 10:23:56.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:23:56 np0005548918 nova_compute[229246]: 2025-12-06 10:23:56.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:23:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:57 np0005548918 nova_compute[229246]: 2025-12-06 10:23:57.503 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:23:57 np0005548918 nova_compute[229246]: 2025-12-06 10:23:57.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:23:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:57.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:57.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:59 np0005548918 nova_compute[229246]: 2025-12-06 10:23:59.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:23:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:59.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:23:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:23:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:59.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:23:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:23:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:23:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:23:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:00 np0005548918 nova_compute[229246]: 2025-12-06 10:24:00.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:00 np0005548918 nova_compute[229246]: 2025-12-06 10:24:00.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:24:00 np0005548918 nova_compute[229246]: 2025-12-06 10:24:00.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:24:00 np0005548918 nova_compute[229246]: 2025-12-06 10:24:00.555 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:24:00 np0005548918 nova_compute[229246]: 2025-12-06 10:24:00.699 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:01.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:01.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:02 np0005548918 nova_compute[229246]: 2025-12-06 10:24:02.505 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:03 np0005548918 nova_compute[229246]: 2025-12-06 10:24:03.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:03 np0005548918 nova_compute[229246]: 2025-12-06 10:24:03.552 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:24:03 np0005548918 nova_compute[229246]: 2025-12-06 10:24:03.553 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:24:03 np0005548918 nova_compute[229246]: 2025-12-06 10:24:03.553 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:24:03 np0005548918 nova_compute[229246]: 2025-12-06 10:24:03.553 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:24:03 np0005548918 nova_compute[229246]: 2025-12-06 10:24:03.553 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:24:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:03.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:03.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:03 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:24:03 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/747380932' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:24:03 np0005548918 nova_compute[229246]: 2025-12-06 10:24:03.973 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:24:04 np0005548918 nova_compute[229246]: 2025-12-06 10:24:04.122 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:24:04 np0005548918 nova_compute[229246]: 2025-12-06 10:24:04.123 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4822MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:24:04 np0005548918 nova_compute[229246]: 2025-12-06 10:24:04.124 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:24:04 np0005548918 nova_compute[229246]: 2025-12-06 10:24:04.124 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:24:04 np0005548918 podman[252234]: 2025-12-06 10:24:04.189174306 +0000 UTC m=+0.072331213 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251125)
Dec  6 05:24:04 np0005548918 nova_compute[229246]: 2025-12-06 10:24:04.196 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:24:04 np0005548918 nova_compute[229246]: 2025-12-06 10:24:04.197 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:24:04 np0005548918 nova_compute[229246]: 2025-12-06 10:24:04.211 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:24:04 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:24:04 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/962126688' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:24:04 np0005548918 nova_compute[229246]: 2025-12-06 10:24:04.636 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:24:04 np0005548918 nova_compute[229246]: 2025-12-06 10:24:04.643 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:24:04 np0005548918 nova_compute[229246]: 2025-12-06 10:24:04.668 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:24:04 np0005548918 nova_compute[229246]: 2025-12-06 10:24:04.670 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:24:04 np0005548918 nova_compute[229246]: 2025-12-06 10:24:04.671 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:24:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:24:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:05.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:24:05 np0005548918 nova_compute[229246]: 2025-12-06 10:24:05.666 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:05 np0005548918 nova_compute[229246]: 2025-12-06 10:24:05.702 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:05.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:07 np0005548918 nova_compute[229246]: 2025-12-06 10:24:07.508 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:07.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:07.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:08 np0005548918 podman[252286]: 2025-12-06 10:24:08.207325152 +0000 UTC m=+0.084297307 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec  6 05:24:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:24:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:09.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:24:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:09.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:10 np0005548918 nova_compute[229246]: 2025-12-06 10:24:10.704 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:11.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:11.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:12 np0005548918 nova_compute[229246]: 2025-12-06 10:24:12.511 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:13.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:24:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:13.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:24:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:15.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:15 np0005548918 nova_compute[229246]: 2025-12-06 10:24:15.706 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:15.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:17 np0005548918 podman[252314]: 2025-12-06 10:24:17.1612273 +0000 UTC m=+0.050577017 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Dec  6 05:24:17 np0005548918 nova_compute[229246]: 2025-12-06 10:24:17.557 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:17.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:17.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:24:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:19.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:24:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:19.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:20 np0005548918 nova_compute[229246]: 2025-12-06 10:24:20.708 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:21.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:21.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:22 np0005548918 nova_compute[229246]: 2025-12-06 10:24:22.559 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:23.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:23.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:24:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:25.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:24:25 np0005548918 nova_compute[229246]: 2025-12-06 10:24:25.770 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:25.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:27 np0005548918 nova_compute[229246]: 2025-12-06 10:24:27.563 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.002000053s ======
Dec  6 05:24:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:27.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Dec  6 05:24:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:27.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:29.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:29.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:30 np0005548918 nova_compute[229246]: 2025-12-06 10:24:30.783 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:31.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:31.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:32 np0005548918 nova_compute[229246]: 2025-12-06 10:24:32.565 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:33.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:33.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:35 np0005548918 podman[252378]: 2025-12-06 10:24:35.204762315 +0000 UTC m=+0.089817136 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:24:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:35.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:35 np0005548918 nova_compute[229246]: 2025-12-06 10:24:35.786 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:24:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:35.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:24:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:37 np0005548918 nova_compute[229246]: 2025-12-06 10:24:37.567 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:37.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:37.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:39 np0005548918 podman[252436]: 2025-12-06 10:24:39.195064087 +0000 UTC m=+0.071912743 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec  6 05:24:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:39.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:39.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:40 np0005548918 nova_compute[229246]: 2025-12-06 10:24:40.790 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:41.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:41.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:42 np0005548918 nova_compute[229246]: 2025-12-06 10:24:42.569 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.215771) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683215825, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 759, "num_deletes": 251, "total_data_size": 1539291, "memory_usage": 1565000, "flush_reason": "Manual Compaction"}
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683227156, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 1017152, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38696, "largest_seqno": 39450, "table_properties": {"data_size": 1013510, "index_size": 1486, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8250, "raw_average_key_size": 19, "raw_value_size": 1006253, "raw_average_value_size": 2362, "num_data_blocks": 65, "num_entries": 426, "num_filter_entries": 426, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016631, "oldest_key_time": 1765016631, "file_creation_time": 1765016683, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 11431 microseconds, and 3249 cpu microseconds.
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.227205) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 1017152 bytes OK
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.227220) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.228703) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.228713) EVENT_LOG_v1 {"time_micros": 1765016683228710, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.228726) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 1535275, prev total WAL file size 1535275, number of live WAL files 2.
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.229253) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(993KB)], [72(13MB)]
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683229316, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 15506326, "oldest_snapshot_seqno": -1}
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6764 keys, 13325449 bytes, temperature: kUnknown
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683315711, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 13325449, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13282836, "index_size": 24581, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16965, "raw_key_size": 177640, "raw_average_key_size": 26, "raw_value_size": 13163517, "raw_average_value_size": 1946, "num_data_blocks": 963, "num_entries": 6764, "num_filter_entries": 6764, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765016683, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.316060) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 13325449 bytes
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.334647) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 179.2 rd, 154.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 13.8 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(28.3) write-amplify(13.1) OK, records in: 7278, records dropped: 514 output_compression: NoCompression
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.334681) EVENT_LOG_v1 {"time_micros": 1765016683334669, "job": 44, "event": "compaction_finished", "compaction_time_micros": 86529, "compaction_time_cpu_micros": 25814, "output_level": 6, "num_output_files": 1, "total_output_size": 13325449, "num_input_records": 7278, "num_output_records": 6764, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683334973, "job": 44, "event": "table_file_deletion", "file_number": 74}
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683337342, "job": 44, "event": "table_file_deletion", "file_number": 72}
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.229143) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.337407) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.337416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.337418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.337420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:24:43 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:24:43.337422) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:24:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:43.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:24:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:43.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:24:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:45.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:45 np0005548918 nova_compute[229246]: 2025-12-06 10:24:45.791 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:45.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  6 05:24:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/120415646' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  6 05:24:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  6 05:24:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/120415646' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  6 05:24:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:47 np0005548918 nova_compute[229246]: 2025-12-06 10:24:47.571 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:47.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:47.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:48 np0005548918 podman[252465]: 2025-12-06 10:24:48.183019117 +0000 UTC m=+0.060646339 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:24:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:49.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:49.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:50 np0005548918 nova_compute[229246]: 2025-12-06 10:24:50.792 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:51 np0005548918 nova_compute[229246]: 2025-12-06 10:24:51.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:51.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:51.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:52 np0005548918 nova_compute[229246]: 2025-12-06 10:24:52.573 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:53.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:24:53.696 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:24:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:24:53.696 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:24:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:24:53.696 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:24:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:53.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:24:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:55.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:55 np0005548918 nova_compute[229246]: 2025-12-06 10:24:55.829 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:55.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:56 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  6 05:24:56 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:24:56 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:24:56 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:24:56 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:24:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:57 np0005548918 nova_compute[229246]: 2025-12-06 10:24:57.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:57 np0005548918 nova_compute[229246]: 2025-12-06 10:24:57.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:57 np0005548918 nova_compute[229246]: 2025-12-06 10:24:57.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:57 np0005548918 nova_compute[229246]: 2025-12-06 10:24:57.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:24:57 np0005548918 nova_compute[229246]: 2025-12-06 10:24:57.575 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:24:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:57.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:57.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:58 np0005548918 nova_compute[229246]: 2025-12-06 10:24:58.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:24:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:24:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:24:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:24:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:59.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:24:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:24:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:24:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:59.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:24:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:24:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:00 np0005548918 nova_compute[229246]: 2025-12-06 10:25:00.832 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:25:01 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:25:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:01 np0005548918 nova_compute[229246]: 2025-12-06 10:25:01.532 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:01 np0005548918 nova_compute[229246]: 2025-12-06 10:25:01.554 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:01.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:01.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:02 np0005548918 nova_compute[229246]: 2025-12-06 10:25:02.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:02 np0005548918 nova_compute[229246]: 2025-12-06 10:25:02.535 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:25:02 np0005548918 nova_compute[229246]: 2025-12-06 10:25:02.535 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:25:02 np0005548918 nova_compute[229246]: 2025-12-06 10:25:02.555 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:25:02 np0005548918 nova_compute[229246]: 2025-12-06 10:25:02.578 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:03.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:25:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:03.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:25:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:04 np0005548918 nova_compute[229246]: 2025-12-06 10:25:04.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:04 np0005548918 nova_compute[229246]: 2025-12-06 10:25:04.669 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:25:04 np0005548918 nova_compute[229246]: 2025-12-06 10:25:04.670 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:25:04 np0005548918 nova_compute[229246]: 2025-12-06 10:25:04.670 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:25:04 np0005548918 nova_compute[229246]: 2025-12-06 10:25:04.670 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:25:04 np0005548918 nova_compute[229246]: 2025-12-06 10:25:04.670 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:25:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:25:05 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3124201791' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.139 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:25:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.286 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.287 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4832MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.288 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.288 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:25:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.401 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.402 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.426 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:25:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:05.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.834 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:25:05 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2298643697' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.885 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.892 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:25:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:05.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.910 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.912 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:25:05 np0005548918 nova_compute[229246]: 2025-12-06 10:25:05.912 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:25:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:06 np0005548918 podman[252679]: 2025-12-06 10:25:06.262343818 +0000 UTC m=+0.135601083 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:25:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:07 np0005548918 nova_compute[229246]: 2025-12-06 10:25:07.580 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:07.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:07.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:07 np0005548918 nova_compute[229246]: 2025-12-06 10:25:07.909 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:25:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:09.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:25:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:25:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:09.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:25:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:10 np0005548918 podman[252709]: 2025-12-06 10:25:10.185759384 +0000 UTC m=+0.075125900 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:25:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:10 np0005548918 nova_compute[229246]: 2025-12-06 10:25:10.838 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:11.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:11.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:12 np0005548918 nova_compute[229246]: 2025-12-06 10:25:12.582 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:25:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:13.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:25:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:13.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:15.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:15 np0005548918 nova_compute[229246]: 2025-12-06 10:25:15.839 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:15.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:17 np0005548918 nova_compute[229246]: 2025-12-06 10:25:17.585 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:17.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:17.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:19 np0005548918 podman[252763]: 2025-12-06 10:25:19.204592407 +0000 UTC m=+0.083101944 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2)
Dec  6 05:25:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:19.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:25:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:19.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:25:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:20 np0005548918 nova_compute[229246]: 2025-12-06 10:25:20.842 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:21.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:21.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:22 np0005548918 nova_compute[229246]: 2025-12-06 10:25:22.589 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:23.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:23.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:25.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:25 np0005548918 nova_compute[229246]: 2025-12-06 10:25:25.846 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:25.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:27 np0005548918 nova_compute[229246]: 2025-12-06 10:25:27.591 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:27.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:27.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:29.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:29.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:30 np0005548918 nova_compute[229246]: 2025-12-06 10:25:30.848 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:31.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:31.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:32 np0005548918 nova_compute[229246]: 2025-12-06 10:25:32.594 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:25:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:33.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:25:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:33.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:35.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:35 np0005548918 nova_compute[229246]: 2025-12-06 10:25:35.883 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:35.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:37 np0005548918 podman[252802]: 2025-12-06 10:25:37.298450823 +0000 UTC m=+0.170657830 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:25:37 np0005548918 nova_compute[229246]: 2025-12-06 10:25:37.596 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:37.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:37.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:39.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:39.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:40 np0005548918 nova_compute[229246]: 2025-12-06 10:25:40.915 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:41 np0005548918 podman[252857]: 2025-12-06 10:25:41.167962144 +0000 UTC m=+0.047641288 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  6 05:25:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:41.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:41.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:42 np0005548918 nova_compute[229246]: 2025-12-06 10:25:42.598 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:43.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:25:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:43.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:25:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:25:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:45.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:25:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:45.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:45 np0005548918 nova_compute[229246]: 2025-12-06 10:25:45.963 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:47 np0005548918 nova_compute[229246]: 2025-12-06 10:25:47.601 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:47.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:47.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:49.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:49.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:50 np0005548918 podman[252885]: 2025-12-06 10:25:50.156022505 +0000 UTC m=+0.045679754 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  6 05:25:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:50 np0005548918 nova_compute[229246]: 2025-12-06 10:25:50.990 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:51.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:51.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:52 np0005548918 nova_compute[229246]: 2025-12-06 10:25:52.605 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:53 np0005548918 nova_compute[229246]: 2025-12-06 10:25:53.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:25:53.697 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:25:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:25:53.698 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:25:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:25:53.698 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:25:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:53.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:53.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:54 np0005548918 nova_compute[229246]: 2025-12-06 10:25:54.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:54 np0005548918 nova_compute[229246]: 2025-12-06 10:25:54.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  6 05:25:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:55 np0005548918 nova_compute[229246]: 2025-12-06 10:25:55.316 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  6 05:25:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:25:55 np0005548918 nova_compute[229246]: 2025-12-06 10:25:55.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:55.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:55.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:56 np0005548918 nova_compute[229246]: 2025-12-06 10:25:56.034 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:57 np0005548918 nova_compute[229246]: 2025-12-06 10:25:57.549 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:57 np0005548918 nova_compute[229246]: 2025-12-06 10:25:57.550 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:57 np0005548918 nova_compute[229246]: 2025-12-06 10:25:57.551 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:57 np0005548918 nova_compute[229246]: 2025-12-06 10:25:57.551 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:25:57 np0005548918 nova_compute[229246]: 2025-12-06 10:25:57.607 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:25:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:57.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:25:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:25:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:57.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:25:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:25:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:59 np0005548918 nova_compute[229246]: 2025-12-06 10:25:59.538 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:25:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:25:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:59.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:25:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:25:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:25:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:25:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:25:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:59.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:01 np0005548918 nova_compute[229246]: 2025-12-06 10:26:01.250 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:01 np0005548918 nova_compute[229246]: 2025-12-06 10:26:01.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:01.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:01 np0005548918 podman[253065]: 2025-12-06 10:26:01.881700948 +0000 UTC m=+0.053297600 container exec 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Dec  6 05:26:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:01.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:01 np0005548918 podman[253065]: 2025-12-06 10:26:01.982665945 +0000 UTC m=+0.154262587 container exec_died 9800312b2542fe0693288675ab107c9c61698c742f14979a86f6cba5b2aa9684 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-2, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  6 05:26:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:02 np0005548918 podman[253184]: 2025-12-06 10:26:02.47794795 +0000 UTC m=+0.069790305 container exec 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 05:26:02 np0005548918 nova_compute[229246]: 2025-12-06 10:26:02.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:02 np0005548918 nova_compute[229246]: 2025-12-06 10:26:02.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:26:02 np0005548918 nova_compute[229246]: 2025-12-06 10:26:02.537 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:26:02 np0005548918 podman[253209]: 2025-12-06 10:26:02.544338823 +0000 UTC m=+0.050756552 container exec_died 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 05:26:02 np0005548918 podman[253184]: 2025-12-06 10:26:02.550333675 +0000 UTC m=+0.142176020 container exec_died 323c7317ccdb5f3560897f17d8d0f7f3c36e4427dab596acc2e4717dd220186b (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  6 05:26:02 np0005548918 nova_compute[229246]: 2025-12-06 10:26:02.556 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:26:02 np0005548918 nova_compute[229246]: 2025-12-06 10:26:02.608 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:03 np0005548918 podman[253322]: 2025-12-06 10:26:03.191938182 +0000 UTC m=+0.086083375 container exec 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna)
Dec  6 05:26:03 np0005548918 podman[253322]: 2025-12-06 10:26:03.203607868 +0000 UTC m=+0.097753051 container exec_died 291e33d7558df1250bc1d75586903aba6000ccad9dd3cb120f4999944db31c98 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-2-voodna)
Dec  6 05:26:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:03 np0005548918 podman[253387]: 2025-12-06 10:26:03.437463903 +0000 UTC m=+0.058762938 container exec cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1793, description=keepalived for Ceph, vcs-type=git, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, version=2.2.4, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc.)
Dec  6 05:26:03 np0005548918 podman[253387]: 2025-12-06 10:26:03.447826493 +0000 UTC m=+0.069125518 container exec_died cbcabdb9b139bf7198b15438accb8f4a51fb667fdf4f19be3cdf7b28a8213220 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg, distribution-scope=public, io.openshift.tags=Ceph keepalived, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, com.redhat.component=keepalived-container, io.openshift.expose-services=, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec  6 05:26:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:03.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:03.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:04 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  6 05:26:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:05 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  6 05:26:05 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:26:05 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:05 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:05 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:26:05 np0005548918 ceph-mon[75798]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec  6 05:26:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:05.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:26:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:05.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:26:06 np0005548918 nova_compute[229246]: 2025-12-06 10:26:06.254 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:06 np0005548918 nova_compute[229246]: 2025-12-06 10:26:06.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:06 np0005548918 nova_compute[229246]: 2025-12-06 10:26:06.557 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:26:06 np0005548918 nova_compute[229246]: 2025-12-06 10:26:06.557 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:26:06 np0005548918 nova_compute[229246]: 2025-12-06 10:26:06.557 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:26:06 np0005548918 nova_compute[229246]: 2025-12-06 10:26:06.557 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:26:06 np0005548918 nova_compute[229246]: 2025-12-06 10:26:06.558 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:26:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:07 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:26:07 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3490516856' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.029 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.183 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.185 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4763MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.185 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.185 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:26:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.341 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.341 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.361 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.611 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:07.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:07 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:26:07 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/239905449' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.810 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.815 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.838 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.839 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:26:07 np0005548918 nova_compute[229246]: 2025-12-06 10:26:07.840 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:26:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:07.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:08 np0005548918 podman[253584]: 2025-12-06 10:26:08.208440299 +0000 UTC m=+0.089811586 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  6 05:26:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:08 np0005548918 nova_compute[229246]: 2025-12-06 10:26:08.836 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:09.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:26:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:09.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:26:10 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:10 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:10 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:26:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:11 np0005548918 nova_compute[229246]: 2025-12-06 10:26:11.256 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:11.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:11.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:12 np0005548918 podman[253643]: 2025-12-06 10:26:12.15799661 +0000 UTC m=+0.051672746 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:26:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:12 np0005548918 nova_compute[229246]: 2025-12-06 10:26:12.613 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:13.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:13.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:15.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:15.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:16 np0005548918 nova_compute[229246]: 2025-12-06 10:26:16.258 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:17 np0005548918 nova_compute[229246]: 2025-12-06 10:26:17.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:17 np0005548918 nova_compute[229246]: 2025-12-06 10:26:17.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  6 05:26:17 np0005548918 nova_compute[229246]: 2025-12-06 10:26:17.615 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:17.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:17.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:19.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:19.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.138423) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780138450, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1286, "num_deletes": 255, "total_data_size": 2971840, "memory_usage": 3005616, "flush_reason": "Manual Compaction"}
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780152496, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 1942990, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39455, "largest_seqno": 40736, "table_properties": {"data_size": 1937422, "index_size": 2900, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12264, "raw_average_key_size": 19, "raw_value_size": 1926054, "raw_average_value_size": 3116, "num_data_blocks": 125, "num_entries": 618, "num_filter_entries": 618, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016684, "oldest_key_time": 1765016684, "file_creation_time": 1765016780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 14113 microseconds, and 5619 cpu microseconds.
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.152534) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 1942990 bytes OK
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.152552) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.154432) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.154446) EVENT_LOG_v1 {"time_micros": 1765016780154442, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.154461) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 2965655, prev total WAL file size 2965655, number of live WAL files 2.
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.155312) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303035' seq:72057594037927935, type:22 .. '6C6F676D0031323536' seq:0, type:0; will stop at (end)
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(1897KB)], [75(12MB)]
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780155391, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15268439, "oldest_snapshot_seqno": -1}
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6853 keys, 15099864 bytes, temperature: kUnknown
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780254850, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 15099864, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15054806, "index_size": 26834, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 180441, "raw_average_key_size": 26, "raw_value_size": 14932119, "raw_average_value_size": 2178, "num_data_blocks": 1056, "num_entries": 6853, "num_filter_entries": 6853, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014009, "oldest_key_time": 0, "file_creation_time": 1765016780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "32a3339c-60f1-43dd-9342-fe763f9bb1ae", "db_session_id": "XHXHRBK6HNZMFGDONKUX", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.255127) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 15099864 bytes
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.256979) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.6 rd, 151.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.7 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(15.6) write-amplify(7.8) OK, records in: 7382, records dropped: 529 output_compression: NoCompression
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.256995) EVENT_LOG_v1 {"time_micros": 1765016780256987, "job": 46, "event": "compaction_finished", "compaction_time_micros": 99399, "compaction_time_cpu_micros": 34814, "output_level": 6, "num_output_files": 1, "total_output_size": 15099864, "num_input_records": 7382, "num_output_records": 6853, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780257466, "job": 46, "event": "table_file_deletion", "file_number": 77}
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780259647, "job": 46, "event": "table_file_deletion", "file_number": 75}
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.155225) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.259708) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.259713) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.259715) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.259716) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: rocksdb: (Original Log Time 2025/12/06-10:26:20.259717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  6 05:26:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:21 np0005548918 podman[253696]: 2025-12-06 10:26:21.191091516 +0000 UTC m=+0.069164378 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  6 05:26:21 np0005548918 nova_compute[229246]: 2025-12-06 10:26:21.260 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:21.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:21.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:22 np0005548918 nova_compute[229246]: 2025-12-06 10:26:22.619 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:23.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:23.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:25.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:26.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:26 np0005548918 nova_compute[229246]: 2025-12-06 10:26:26.261 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:27 np0005548918 nova_compute[229246]: 2025-12-06 10:26:27.622 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:27.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:28.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:29.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:30.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:31 np0005548918 nova_compute[229246]: 2025-12-06 10:26:31.263 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:31.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:32.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:32 np0005548918 nova_compute[229246]: 2025-12-06 10:26:32.624 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:33.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:34.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:35.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:36.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:36 np0005548918 nova_compute[229246]: 2025-12-06 10:26:36.265 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:37 np0005548918 nova_compute[229246]: 2025-12-06 10:26:37.626 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:37.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:38.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:39 np0005548918 podman[253762]: 2025-12-06 10:26:39.239340897 +0000 UTC m=+0.121501722 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  6 05:26:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:39.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:40.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:41 np0005548918 nova_compute[229246]: 2025-12-06 10:26:41.266 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:26:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:41.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:26:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:42.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:42 np0005548918 nova_compute[229246]: 2025-12-06 10:26:42.628 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:42 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:43 np0005548918 podman[253792]: 2025-12-06 10:26:43.183176555 +0000 UTC m=+0.057781441 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec  6 05:26:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:43 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:43 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:43 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:43.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:43 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:43 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:44 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:44 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:44 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:44.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:44 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:44 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:45 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:45 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:45 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:45 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:45.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:45 np0005548918 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  6 05:26:45 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:45 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  6 05:26:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/430995373' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  6 05:26:46 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  6 05:26:46 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/430995373' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  6 05:26:46 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:46 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:46 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:46.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:46 np0005548918 nova_compute[229246]: 2025-12-06 10:26:46.267 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:46 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:46 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:47 np0005548918 nova_compute[229246]: 2025-12-06 10:26:47.632 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:47 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:47 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:47 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:47.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:47 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:47 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:48 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:48 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:48 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:48.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:48 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:48 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:49 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:49 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:49 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:49.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:49 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:49 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:50 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:50 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:50 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:50.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:50 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:50 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:50 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:51 np0005548918 nova_compute[229246]: 2025-12-06 10:26:51.269 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:51 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:51 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:51 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:51.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:51 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:51 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:52 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:52 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:52 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:52.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:52 np0005548918 podman[253824]: 2025-12-06 10:26:52.198028981 +0000 UTC m=+0.078169212 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  6 05:26:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:52 np0005548918 nova_compute[229246]: 2025-12-06 10:26:52.634 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:52 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:52 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:26:53.699 141640 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:26:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:26:53.700 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:26:53 np0005548918 ovn_metadata_agent[141635]: 2025-12-06 10:26:53.700 141640 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:26:53 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:53 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:53 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:53.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:53 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:53 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:54 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:54 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:54 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:54.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:54 np0005548918 nova_compute[229246]: 2025-12-06 10:26:54.550 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:54 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:54 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:55 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:26:55 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:55 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:55 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:55.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:55 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:55 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:56 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:56 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:56 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:56.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:56 np0005548918 nova_compute[229246]: 2025-12-06 10:26:56.271 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:56 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:56 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:57 np0005548918 nova_compute[229246]: 2025-12-06 10:26:57.636 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:26:57 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:57 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:26:57 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:57.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:26:57 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:57 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:58 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:58 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:58 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:58.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:58 np0005548918 nova_compute[229246]: 2025-12-06 10:26:58.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:58 np0005548918 nova_compute[229246]: 2025-12-06 10:26:58.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:58 np0005548918 nova_compute[229246]: 2025-12-06 10:26:58.536 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  6 05:26:58 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:58 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:26:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:26:59 np0005548918 nova_compute[229246]: 2025-12-06 10:26:59.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:59 np0005548918 nova_compute[229246]: 2025-12-06 10:26:59.536 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:26:59 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:26:59 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:26:59 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:59.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:26:59 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:26:59 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:00 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:00 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:27:00 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:00.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:27:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:00 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:00 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:00 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:01 np0005548918 nova_compute[229246]: 2025-12-06 10:27:01.272 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:27:01 np0005548918 nova_compute[229246]: 2025-12-06 10:27:01.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:27:01 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:01 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:01 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:01.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:01 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:01 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:02 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:02 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:02 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:02.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:02 np0005548918 nova_compute[229246]: 2025-12-06 10:27:02.530 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:27:02 np0005548918 nova_compute[229246]: 2025-12-06 10:27:02.557 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:27:02 np0005548918 nova_compute[229246]: 2025-12-06 10:27:02.557 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  6 05:27:02 np0005548918 nova_compute[229246]: 2025-12-06 10:27:02.557 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  6 05:27:02 np0005548918 nova_compute[229246]: 2025-12-06 10:27:02.573 229250 DEBUG nova.compute.manager [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  6 05:27:02 np0005548918 nova_compute[229246]: 2025-12-06 10:27:02.654 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:27:02 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:02 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:03 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:03 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:03 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:03.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:03 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:03 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:04 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:04 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:27:04 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:04.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:27:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:04 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:04 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:05 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:05 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:05 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:05 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:05.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:05 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:05 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:06 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:06 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:06 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:06.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:06 np0005548918 nova_compute[229246]: 2025-12-06 10:27:06.306 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:27:06 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:06 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:07 np0005548918 nova_compute[229246]: 2025-12-06 10:27:07.574 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:27:07 np0005548918 nova_compute[229246]: 2025-12-06 10:27:07.656 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:27:07 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:07 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:27:07 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:07.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:27:07 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:07 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:08 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:08 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:27:08 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:08.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:27:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:08 np0005548918 nova_compute[229246]: 2025-12-06 10:27:08.535 229250 DEBUG oslo_service.periodic_task [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  6 05:27:08 np0005548918 nova_compute[229246]: 2025-12-06 10:27:08.602 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:27:08 np0005548918 nova_compute[229246]: 2025-12-06 10:27:08.602 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:27:08 np0005548918 nova_compute[229246]: 2025-12-06 10:27:08.602 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:27:08 np0005548918 nova_compute[229246]: 2025-12-06 10:27:08.603 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  6 05:27:08 np0005548918 nova_compute[229246]: 2025-12-06 10:27:08.603 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:27:08 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:08 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:09 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:27:09 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1908876995' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:27:09 np0005548918 nova_compute[229246]: 2025-12-06 10:27:09.029 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:27:09 np0005548918 nova_compute[229246]: 2025-12-06 10:27:09.167 229250 WARNING nova.virt.libvirt.driver [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  6 05:27:09 np0005548918 nova_compute[229246]: 2025-12-06 10:27:09.169 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4821MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  6 05:27:09 np0005548918 nova_compute[229246]: 2025-12-06 10:27:09.169 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  6 05:27:09 np0005548918 nova_compute[229246]: 2025-12-06 10:27:09.169 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  6 05:27:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:09 np0005548918 nova_compute[229246]: 2025-12-06 10:27:09.638 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  6 05:27:09 np0005548918 nova_compute[229246]: 2025-12-06 10:27:09.638 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  6 05:27:09 np0005548918 nova_compute[229246]: 2025-12-06 10:27:09.664 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing inventories for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  6 05:27:09 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:09 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:27:09 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:09.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:27:09 np0005548918 nova_compute[229246]: 2025-12-06 10:27:09.870 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating ProviderTree inventory for provider 31f5f484-bf36-44de-83b8-7b434061a77b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  6 05:27:09 np0005548918 nova_compute[229246]: 2025-12-06 10:27:09.870 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Updating inventory in ProviderTree for provider 31f5f484-bf36-44de-83b8-7b434061a77b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  6 05:27:09 np0005548918 nova_compute[229246]: 2025-12-06 10:27:09.916 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing aggregate associations for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  6 05:27:09 np0005548918 nova_compute[229246]: 2025-12-06 10:27:09.935 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Refreshing trait associations for resource provider 31f5f484-bf36-44de-83b8-7b434061a77b, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE4A,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_ABM,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,HW_CPU_X86_AVX2,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  6 05:27:09 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:09 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:09 np0005548918 nova_compute[229246]: 2025-12-06 10:27:09.952 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  6 05:27:10 np0005548918 podman[253933]: 2025-12-06 10:27:10.043008324 +0000 UTC m=+0.094714519 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  6 05:27:10 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:10 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:10 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:10.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:10 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  6 05:27:10 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3374521652' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  6 05:27:10 np0005548918 nova_compute[229246]: 2025-12-06 10:27:10.420 229250 DEBUG oslo_concurrency.processutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  6 05:27:10 np0005548918 nova_compute[229246]: 2025-12-06 10:27:10.426 229250 DEBUG nova.compute.provider_tree [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed in ProviderTree for provider: 31f5f484-bf36-44de-83b8-7b434061a77b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  6 05:27:10 np0005548918 nova_compute[229246]: 2025-12-06 10:27:10.833 229250 DEBUG nova.scheduler.client.report [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Inventory has not changed for provider 31f5f484-bf36-44de-83b8-7b434061a77b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  6 05:27:10 np0005548918 nova_compute[229246]: 2025-12-06 10:27:10.835 229250 DEBUG nova.compute.resource_tracker [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  6 05:27:10 np0005548918 nova_compute[229246]: 2025-12-06 10:27:10.836 229250 DEBUG oslo_concurrency.lockutils [None req-fb90d993-3d6a-4638-a4cf-c8b9cf51b2a7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  6 05:27:10 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:10 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:11 np0005548918 nova_compute[229246]: 2025-12-06 10:27:11.309 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:27:11 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:11 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:27:11 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:11.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:27:11 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:11 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:12 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:12 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:12 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:12.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:12 np0005548918 nova_compute[229246]: 2025-12-06 10:27:12.658 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:27:12 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:12 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:13 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:13 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:27:13 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:13.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:27:13 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:13 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:14 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:14 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:14 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:14.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:14 np0005548918 podman[254040]: 2025-12-06 10:27:14.176996668 +0000 UTC m=+0.062090939 container health_status ee3aacf8b59d5240541e6c363be273a14f0c6c6cd546ba1f5c0c5db67df059b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Dec  6 05:27:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:14 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:14 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:14 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:27:14 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:27:14 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  6 05:27:14 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:27:14 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:27:14 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  6 05:27:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:15 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:15 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:15 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:15 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:15.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:15 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:15 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:15 np0005548918 ceph-mon[75798]: Health check update: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec  6 05:27:16 np0005548918 systemd-logind[800]: New session 58 of user zuul.
Dec  6 05:27:16 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:16 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:27:16 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:16.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:27:16 np0005548918 systemd[1]: Started Session 58 of User zuul.
Dec  6 05:27:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:16 np0005548918 nova_compute[229246]: 2025-12-06 10:27:16.310 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:27:16 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:16 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:17 np0005548918 nova_compute[229246]: 2025-12-06 10:27:17.694 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:27:17 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:17 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:27:17 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:17.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:27:17 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:17 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:18 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:18 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:18 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:18.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:18 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:18 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:19 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec  6 05:27:19 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3613403269' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  6 05:27:19 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:19 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:27:19 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:19.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:27:19 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:19 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:20 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:20 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:20 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:20.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:20 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:20 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:20 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:21 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:27:21 np0005548918 ceph-mon[75798]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec  6 05:27:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:21 np0005548918 nova_compute[229246]: 2025-12-06 10:27:21.363 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:27:21 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:21 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:21 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:21.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:21 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:21 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:22 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:22 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:22 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:22.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:22 np0005548918 nova_compute[229246]: 2025-12-06 10:27:22.696 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:27:22 np0005548918 ovs-vsctl[254438]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  6 05:27:22 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:22 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:23 np0005548918 podman[254476]: 2025-12-06 10:27:23.213828598 +0000 UTC m=+0.089198690 container health_status 33634f88a41178009f6367b4c96628c4981e1ab84919188f33072f05db349e21 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  6 05:27:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:23 np0005548918 virtqemud[228866]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  6 05:27:23 np0005548918 virtqemud[228866]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  6 05:27:23 np0005548918 virtqemud[228866]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  6 05:27:23 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:23 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:27:23 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:23.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:27:23 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:23 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:24 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:24 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:24 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:24.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:24 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: cache status {prefix=cache status} (starting...)
Dec  6 05:27:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:24 np0005548918 lvm[254781]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  6 05:27:24 np0005548918 lvm[254781]: VG ceph_vg0 finished
Dec  6 05:27:24 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: client ls {prefix=client ls} (starting...)
Dec  6 05:27:24 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:24 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:24 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: damage ls {prefix=damage ls} (starting...)
Dec  6 05:27:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Dec  6 05:27:25 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2556019190' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec  6 05:27:25 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: dump loads {prefix=dump loads} (starting...)
Dec  6 05:27:25 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec  6 05:27:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:25 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec  6 05:27:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec  6 05:27:25 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2975038871' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  6 05:27:25 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec  6 05:27:25 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec  6 05:27:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Dec  6 05:27:25 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2998337513' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec  6 05:27:25 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:25 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:25 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:25.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:25 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec  6 05:27:25 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec  6 05:27:25 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1843947048' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  6 05:27:25 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:25 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:26 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec  6 05:27:26 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:26 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:26 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:26.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:26 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: ops {prefix=ops} (starting...)
Dec  6 05:27:26 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec  6 05:27:26 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3661082603' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec  6 05:27:26 np0005548918 nova_compute[229246]: 2025-12-06 10:27:26.396 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:27:26 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec  6 05:27:26 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2198453763' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec  6 05:27:26 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  6 05:27:26 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3347068796' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  6 05:27:26 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: session ls {prefix=session ls} (starting...)
Dec  6 05:27:26 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:26 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:27 np0005548918 ceph-mds[84319]: mds.cephfs.compute-2.czucwy asok_command: status {prefix=status} (starting...)
Dec  6 05:27:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  6 05:27:27 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1082146717' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  6 05:27:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  6 05:27:27 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3948499551' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  6 05:27:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Dec  6 05:27:27 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/98991412' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec  6 05:27:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  6 05:27:27 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2140146032' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  6 05:27:27 np0005548918 nova_compute[229246]: 2025-12-06 10:27:27.728 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  6 05:27:27 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:27 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:27:27 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:27.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:27:27 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:27 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:27 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec  6 05:27:27 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2585617435' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec  6 05:27:28 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:28 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:28 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:28.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:28 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:28 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  6 05:27:29 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/933451133' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  6 05:27:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec  6 05:27:29 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/409247307' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec  6 05:27:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec  6 05:27:29 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2687712743' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec  6 05:27:29 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:29 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:27:29 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:29.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:27:29 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  6 05:27:29 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2133887337' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  6 05:27:29 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:29 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:30 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:30 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:30 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:30.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  6 05:27:30 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1290425549' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  6 05:27:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 37.452941895s of 37.460189819s, submitted: 2
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 40960 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c7f1a800 session 0x55f8c7f99a40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855979 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 58.469783783s of 58.474414825s, submitted: 1
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c83f7a40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 857491 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 857491 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 857491 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74260480 unmapped: 65536 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 857491 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.103944778s of 17.108892441s, submitted: 1
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860515 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 57344 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 49152 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 32768 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 24576 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3308729779' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1015808 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1015808 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859924 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 75.162078857s of 75.172508240s, submitted: 3
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862948 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74448896 unmapped: 925696 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8052780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861766 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 802816 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 131.492523193s of 131.637023926s, submitted: 4
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 794624 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 794624 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 794624 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863278 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread fragmentation_score=0.000025 took=0.000037s
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 770048 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 6030 writes, 25K keys, 6030 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 6030 writes, 1100 syncs, 5.48 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 460 writes, 719 keys, 460 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
Interval WAL: 460 writes, 225 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.043       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55f8c486a9b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55f8c486a9b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 753664 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862096 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 105.746887207s of 105.757682800s, submitted: 3
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863608 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 729088 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 729088 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 729088 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.876320839s of 14.883418083s, submitted: 2
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75735040 unmapped: 1736704 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 1654784 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1630208 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75849728 unmapped: 1622016 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75849728 unmapped: 1622016 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8db10e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863017 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 66.274696350s of 66.843589783s, submitted: 197
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1613824 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75866112 unmapped: 1605632 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862426 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 96.137619019s of 96.141578674s, submitted: 1
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 146 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fca6c000/0x0/0x4ffc00000, data 0xfefc6/0x1b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1581056 heap: 77471744 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 147 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 77037568 unmapped: 17219584 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 148 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 149 ms_handle_reset con 0x55f8c564b800 session 0x55f8c60061e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 77045760 unmapped: 17211392 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 77053952 unmapped: 17203200 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 77053952 unmapped: 17203200 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937851 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 150 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8ed2b40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fc25c000/0x0/0x4ffc00000, data 0x905373/0x9be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fc25a000/0x0/0x4ffc00000, data 0x90747b/0x9c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc25a000/0x0/0x4ffc00000, data 0x90747b/0x9c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941715 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941715 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941715 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c6ca1a40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941715 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8f012c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941715 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941715 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 39.368801117s of 39.591522217s, submitted: 60
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943227 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 16146432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78135296 unmapped: 16121856 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78135296 unmapped: 16121856 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc257000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78135296 unmapped: 16121856 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 16113664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943899 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc258000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 16113664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 16113664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 16113664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 16113664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc258000/0x0/0x4ffc00000, data 0x90944d/0x9c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 16113664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c8adc400 session 0x55f8c8f2c960
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943899 data_alloc: 218103808 data_used: 49152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8f2cb40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8f2cd20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 78135296 unmapped: 16121856 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.870017052s of 11.323535919s, submitted: 2
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8f2cf00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8f2d860
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 8380416 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 ms_handle_reset con 0x55f8c8adc800 session 0x55f8c8f2cb40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 85884928 unmapped: 8372224 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 152 handle_osd_map epochs [152,153], i have 152, src has [1,153]
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8f012c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c5eaf0e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8f0c960
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8f0cf00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c8adcc00 session 0x55f8c8ef9680
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 89407488 unmapped: 15417344 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 153 heartbeat osd_stat(store_statfs(0x4fb480000/0x0/0x4ffc00000, data 0x16dd679/0x179a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8f0d4a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 89489408 unmapped: 15335424 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1083921 data_alloc: 218103808 data_used: 6873088
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 153 heartbeat osd_stat(store_statfs(0x4fb480000/0x0/0x4ffc00000, data 0x16dd679/0x179a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 89522176 unmapped: 15302656 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8f0d860
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 89522176 unmapped: 15302656 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8f0da40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 153 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8f0de00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 153 heartbeat osd_stat(store_statfs(0x4fb480000/0x0/0x4ffc00000, data 0x16dd679/0x179a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 89407488 unmapped: 15417344 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 89776128 unmapped: 15048704 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 153 handle_osd_map epochs [153,154], i have 153, src has [1,154]
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175100 data_alloc: 234881024 data_used: 17768448
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb47e000/0x0/0x4ffc00000, data 0x16df64b/0x179d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1a800 session 0x55f8c8f00780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb47e000/0x0/0x4ffc00000, data 0x16df64b/0x179d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175100 data_alloc: 234881024 data_used: 17768448
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb47e000/0x0/0x4ffc00000, data 0x16df64b/0x179d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100442112 unmapped: 4382720 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 100466688 unmapped: 4358144 heap: 104824832 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.688327789s of 18.928897858s, submitted: 82
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193376 data_alloc: 234881024 data_used: 17817600
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110518272 unmapped: 3989504 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109641728 unmapped: 4866048 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109641728 unmapped: 4866048 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9137000/0x0/0x4ffc00000, data 0x288664b/0x2944000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109682688 unmapped: 4825088 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9137000/0x0/0x4ffc00000, data 0x288664b/0x2944000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109682688 unmapped: 4825088 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1327180 data_alloc: 234881024 data_used: 19890176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109682688 unmapped: 4825088 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109682688 unmapped: 4825088 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109862912 unmapped: 4644864 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9114000/0x0/0x4ffc00000, data 0x28aa64b/0x2968000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 4603904 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 4603904 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1323668 data_alloc: 234881024 data_used: 19959808
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 4603904 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 4603904 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 4603904 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.284299850s of 13.593171120s, submitted: 156
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 4603904 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f910a000/0x0/0x4ffc00000, data 0x28b464b/0x2972000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109936640 unmapped: 4571136 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1323085 data_alloc: 234881024 data_used: 19959808
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109936640 unmapped: 4571136 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109936640 unmapped: 4571136 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109936640 unmapped: 4571136 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f910a000/0x0/0x4ffc00000, data 0x28b464b/0x2972000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109944832 unmapped: 4562944 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109953024 unmapped: 4554752 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1323005 data_alloc: 234881024 data_used: 19959808
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8ed32c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8ed2f00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8daf4a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 109953024 unmapped: 4554752 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8915e00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add800 session 0x55f8c8ee1c20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 3399680 heap: 114507776 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c7f80f00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8ef8d20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8dafc20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8db1680
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8addc00 session 0x55f8c8063e00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c89145a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111419392 unmapped: 5259264 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cff000/0x0/0x4ffc00000, data 0x2cbd684/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111419392 unmapped: 5259264 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 5251072 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1362318 data_alloc: 234881024 data_used: 20353024
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 5251072 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 5251072 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 5251072 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cff000/0x0/0x4ffc00000, data 0x2cbd6bd/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111460352 unmapped: 5218304 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111460352 unmapped: 5218304 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1362318 data_alloc: 234881024 data_used: 20353024
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.106805801s of 17.256025314s, submitted: 42
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8ee81e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111476736 unmapped: 5201920 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111484928 unmapped: 5193728 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cff000/0x0/0x4ffc00000, data 0x2cbd6bd/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,3,0,3])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113360896 unmapped: 3317760 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114819072 unmapped: 1859584 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114819072 unmapped: 1859584 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1390523 data_alloc: 234881024 data_used: 24223744
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cfe000/0x0/0x4ffc00000, data 0x2cbd6bd/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114827264 unmapped: 1851392 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114827264 unmapped: 1851392 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114827264 unmapped: 1851392 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cfe000/0x0/0x4ffc00000, data 0x2cbd6bd/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114827264 unmapped: 1851392 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cfe000/0x0/0x4ffc00000, data 0x2cbd6bd/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 1843200 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1390523 data_alloc: 234881024 data_used: 24223744
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 1843200 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.404649734s of 10.734865189s, submitted: 13
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cfe000/0x0/0x4ffc00000, data 0x2cbd6bd/0x2d7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 1843200 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 1843200 heap: 116678656 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 117628928 unmapped: 5013504 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116768768 unmapped: 5873664 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1491925 data_alloc: 234881024 data_used: 25108480
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116768768 unmapped: 5873664 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f81b4000/0x0/0x4ffc00000, data 0x38086bd/0x38c8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f81b4000/0x0/0x4ffc00000, data 0x38086bd/0x38c8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 5865472 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f81b1000/0x0/0x4ffc00000, data 0x380b6bd/0x38cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 5849088 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f81b1000/0x0/0x4ffc00000, data 0x380b6bd/0x38cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f81b1000/0x0/0x4ffc00000, data 0x380b6bd/0x38cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 5849088 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 5849088 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1490653 data_alloc: 234881024 data_used: 25112576
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 6242304 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 6242304 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8190000/0x0/0x4ffc00000, data 0x382c6bd/0x38ec000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.670058250s of 11.008224487s, submitted: 128
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 6242304 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8052960
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8f29c20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f818d000/0x0/0x4ffc00000, data 0x382f6bd/0x38ef000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 6242304 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc8000 session 0x55f8c8ee0000
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 9650176 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337761 data_alloc: 234881024 data_used: 20291584
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 9650176 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c8f28b40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f90fd000/0x0/0x4ffc00000, data 0x28c064b/0x297e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 9650176 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 9641984 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 9641984 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c83f74a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c8ee8780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113008640 unmapped: 9633792 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1338625 data_alloc: 234881024 data_used: 20291584
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f90fd000/0x0/0x4ffc00000, data 0x28c064b/0x297e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,1,1])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c7e0fa40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1007743 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1007743 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 20996096 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.560182571s of 20.807836533s, submitted: 93
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101662720 unmapped: 20979712 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101662720 unmapped: 20979712 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1010767 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101662720 unmapped: 20979712 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1009585 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1009585 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fb0af000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 101687296 unmapped: 20955136 heap: 122642432 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc8000 session 0x55f8c82510e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8250000
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c61a70e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c7f98d20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.574503899s of 13.586823463s, submitted: 4
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c7f990e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc8400 session 0x55f8c83f7680
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102006784 unmapped: 26935296 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c89145a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c8915860
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c8914780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102006784 unmapped: 26935296 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102006784 unmapped: 26935296 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102014976 unmapped: 26927104 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1040148 data_alloc: 218103808 data_used: 5046272
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c89141e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fad2b000/0x0/0x4ffc00000, data 0xc916bd/0xd51000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc8400 session 0x55f8c8efb860
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102014976 unmapped: 26927104 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102014976 unmapped: 26927104 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8efbe00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c8efa3c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fad2a000/0x0/0x4ffc00000, data 0xc916cd/0xd52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068351 data_alloc: 218103808 data_used: 8724480
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fad2a000/0x0/0x4ffc00000, data 0xc916cd/0xd52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068351 data_alloc: 218103808 data_used: 8724480
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fad2a000/0x0/0x4ffc00000, data 0xc916cd/0xd52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102064128 unmapped: 26877952 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102072320 unmapped: 26869760 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102072320 unmapped: 26869760 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 102072320 unmapped: 26869760 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068351 data_alloc: 218103808 data_used: 8724480
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.456418991s of 18.609983444s, submitted: 33
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 24199168 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa60d000/0x0/0x4ffc00000, data 0xf9e6cd/0x105f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104775680 unmapped: 24166400 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104775680 unmapped: 24166400 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104783872 unmapped: 24158208 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104783872 unmapped: 24158208 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 8921088
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 8921088
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104792064 unmapped: 24150016 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 24141824 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 24141824 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 8921088
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 24141824 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 24141824 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 24141824 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 8921088
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 8921088
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa603000/0x0/0x4ffc00000, data 0xfa86cd/0x1069000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 24133632 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.872806549s of 25.992950439s, submitted: 43
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c8efb4a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c731f680
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc8800 session 0x55f8c8dae3c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019398 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019398 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019398 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103006208 unmapped: 25935872 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019398 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019398 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 25927680 heap: 128942080 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8ee0000
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c8ee1860
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c8ee01e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c8ee1c20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.310274124s of 28.440805435s, submitted: 35
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc8c00 session 0x55f8c8ee14a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c6ca0d20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103276544 unmapped: 36167680 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1133149 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103276544 unmapped: 36167680 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103276544 unmapped: 36167680 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103276544 unmapped: 36167680 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103284736 unmapped: 36159488 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc5000/0x0/0x4ffc00000, data 0x17e86ad/0x18a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103284736 unmapped: 36159488 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1133149 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103292928 unmapped: 36151296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c8db1680
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8f2dc20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103301120 unmapped: 36143104 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc4000/0x0/0x4ffc00000, data 0x17e86d0/0x18a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 103301120 unmapped: 36143104 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111468544 unmapped: 27975680 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc4000/0x0/0x4ffc00000, data 0x17e86d0/0x18a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111484928 unmapped: 27959296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1240039 data_alloc: 234881024 data_used: 20131840
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc4000/0x0/0x4ffc00000, data 0x17e86d0/0x18a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111517696 unmapped: 27926528 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111517696 unmapped: 27926528 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111517696 unmapped: 27926528 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc4000/0x0/0x4ffc00000, data 0x17e86d0/0x18a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111517696 unmapped: 27926528 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111525888 unmapped: 27918336 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1240039 data_alloc: 234881024 data_used: 20131840
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111525888 unmapped: 27918336 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111525888 unmapped: 27918336 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc4000/0x0/0x4ffc00000, data 0x17e86d0/0x18a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111525888 unmapped: 27918336 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.866819382s of 19.239244461s, submitted: 49
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 123478016 unmapped: 15966208 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121561088 unmapped: 17883136 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1411225 data_alloc: 234881024 data_used: 22036480
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 17825792 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 17825792 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 17825792 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f88c6000/0x0/0x4ffc00000, data 0x2ce66d0/0x2da6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 17801216 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 17801216 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1415017 data_alloc: 234881024 data_used: 22257664
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 17801216 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 17801216 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f88c4000/0x0/0x4ffc00000, data 0x2ce86d0/0x2da8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 17752064 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 17752064 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.175070763s of 10.511266708s, submitted: 180
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 17752064 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1412699 data_alloc: 234881024 data_used: 22331392
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 17752064 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f88c2000/0x0/0x4ffc00000, data 0x2cea6d0/0x2daa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 17719296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 17719296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 17719296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 17719296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1412931 data_alloc: 234881024 data_used: 22331392
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 17719296 heap: 139444224 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9400 session 0x55f8c61a7860
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9800 session 0x55f8c826b4a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c7de23c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c7de2b40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c7de2780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 20848640 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7f1c000/0x0/0x4ffc00000, data 0x36906d0/0x3750000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 20815872 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 20815872 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7f1c000/0x0/0x4ffc00000, data 0x36906d0/0x3750000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 20783104 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1484749 data_alloc: 234881024 data_used: 22331392
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 20783104 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7f1c000/0x0/0x4ffc00000, data 0x36906d0/0x3750000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 20783104 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.692651749s of 13.755259514s, submitted: 15
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9400 session 0x55f8c6c14960
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 20480000 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 20455424 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 12771328 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1547897 data_alloc: 251658240 data_used: 31326208
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 12771328 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7ef8000/0x0/0x4ffc00000, data 0x36b46d0/0x3774000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7ef8000/0x0/0x4ffc00000, data 0x36b46d0/0x3774000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 12771328 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 12771328 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 12689408 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 12689408 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1547897 data_alloc: 251658240 data_used: 31326208
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7ef8000/0x0/0x4ffc00000, data 0x36b46d0/0x3774000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 12689408 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 12689408 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7ef8000/0x0/0x4ffc00000, data 0x36b46d0/0x3774000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 12656640 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 12656640 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.174408913s of 12.192111969s, submitted: 6
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 130654208 unmapped: 11943936 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1628439 data_alloc: 251658240 data_used: 31383552
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131645440 unmapped: 10952704 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131006464 unmapped: 11591680 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131006464 unmapped: 11591680 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7471000/0x0/0x4ffc00000, data 0x413b6d0/0x41fb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 11493376 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 11493376 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1634727 data_alloc: 251658240 data_used: 31793152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c7f62d20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 11493376 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7471000/0x0/0x4ffc00000, data 0x413b6d0/0x41fb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 11493376 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7471000/0x0/0x4ffc00000, data 0x413b6d0/0x41fb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 11493376 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 11493376 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c6c141e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cb69a000 session 0x55f8c6112780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 11501568 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.638808250s of 10.160227776s, submitted: 89
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1633935 data_alloc: 251658240 data_used: 31793152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c7471a40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125550592 unmapped: 17047552 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f889e000/0x0/0x4ffc00000, data 0x2d0e6d0/0x2dce000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125550592 unmapped: 17047552 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f88c2000/0x0/0x4ffc00000, data 0x2cea6d0/0x2daa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125550592 unmapped: 17047552 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125550592 unmapped: 17047552 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125550592 unmapped: 17047552 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422843 data_alloc: 234881024 data_used: 22331392
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f88c2000/0x0/0x4ffc00000, data 0x2cea6d0/0x2daa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125550592 unmapped: 17047552 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c6ca1e00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c8230b40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f88c2000/0x0/0x4ffc00000, data 0x2cea6d0/0x2daa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 31105024 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7e7a000 session 0x55f8c8ee90e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1050415 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa95e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1050415 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa95e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.673280716s of 18.836557388s, submitted: 65
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1049240 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1049240 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1049240 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111534080 unmapped: 31064064 heap: 142598144 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1049240 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.104139328s of 17.108509064s, submitted: 1
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8251a40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113516544 unmapped: 41156608 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c5746960
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c83f6780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047400 session 0x55f8c83f7c20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c619c780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112869376 unmapped: 41803776 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112869376 unmapped: 41803776 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112869376 unmapped: 41803776 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c83f7e00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c83f65a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112869376 unmapped: 41803776 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163518 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7e7a000 session 0x55f8c8ee0f00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc7000/0x0/0x4ffc00000, data 0x17e66ad/0x18a5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047000 session 0x55f8c8ee12c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112893952 unmapped: 41779200 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112893952 unmapped: 41779200 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260752 data_alloc: 234881024 data_used: 18857984
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc6000/0x0/0x4ffc00000, data 0x17e66bd/0x18a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc6000/0x0/0x4ffc00000, data 0x17e66bd/0x18a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260752 data_alloc: 234881024 data_used: 18857984
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc6000/0x0/0x4ffc00000, data 0x17e66bd/0x18a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc6000/0x0/0x4ffc00000, data 0x17e66bd/0x18a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc6000/0x0/0x4ffc00000, data 0x17e66bd/0x18a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115924992 unmapped: 38748160 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.263729095s of 17.429533005s, submitted: 55
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9dc6000/0x0/0x4ffc00000, data 0x17e66bd/0x18a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118702080 unmapped: 35971072 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 35717120 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1327086 data_alloc: 234881024 data_used: 19050496
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 35717120 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 35717120 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 35717120 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 35717120 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96de000/0x0/0x4ffc00000, data 0x1ecd6bd/0x1f8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 35717120 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1327238 data_alloc: 234881024 data_used: 19054592
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96c0000/0x0/0x4ffc00000, data 0x1eec6bd/0x1fac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96c0000/0x0/0x4ffc00000, data 0x1eec6bd/0x1fac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325086 data_alloc: 234881024 data_used: 19058688
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.177231789s of 13.388985634s, submitted: 86
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325222 data_alloc: 234881024 data_used: 19058688
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96b3000/0x0/0x4ffc00000, data 0x1ef96bd/0x1fb9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119209984 unmapped: 35463168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96b3000/0x0/0x4ffc00000, data 0x1ef96bd/0x1fb9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96b3000/0x0/0x4ffc00000, data 0x1ef96bd/0x1fb9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326126 data_alloc: 234881024 data_used: 19066880
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1f036bd/0x1fc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1f036bd/0x1fc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.655777931s of 12.032317162s, submitted: 4
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326518 data_alloc: 234881024 data_used: 19066880
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 35258368 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c731f680
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7e7a000 session 0x55f8c731f0e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c731ef00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1c00 session 0x55f8c731e780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96a0000/0x0/0x4ffc00000, data 0x1f0c6bd/0x1fcc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,4])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 27844608 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add400 session 0x55f8c8053a40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 35176448 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 8458 writes, 34K keys, 8458 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 8458 writes, 2073 syncs, 4.08 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2428 writes, 9161 keys, 2428 commit groups, 1.0 writes per commit group, ingest: 9.91 MB, 0.02 MB/s#012Interval WAL: 2428 writes, 973 syncs, 2.50 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 35176448 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8d0d000/0x0/0x4ffc00000, data 0x289f6bd/0x295f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 35176448 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1399312 data_alloc: 234881024 data_used: 19066880
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c8ee81e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 35176448 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7e7a000 session 0x55f8c8ee9c20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 35176448 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c8ee85a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1c00 session 0x55f8c82305a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119824384 unmapped: 34848768 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 119824384 unmapped: 34848768 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124633088 unmapped: 30040064 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1470527 data_alloc: 234881024 data_used: 25440256
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8ce8000/0x0/0x4ffc00000, data 0x28c36cd/0x2984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.146208763s of 12.293154716s, submitted: 25
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 30023680 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8ce5000/0x0/0x4ffc00000, data 0x28c66cd/0x2987000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 30023680 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8ce5000/0x0/0x4ffc00000, data 0x28c66cd/0x2987000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8ce5000/0x0/0x4ffc00000, data 0x28c66cd/0x2987000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 30023680 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 30023680 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 30023680 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1470415 data_alloc: 234881024 data_used: 25440256
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 30023680 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124698624 unmapped: 29974528 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124698624 unmapped: 29974528 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8cdf000/0x0/0x4ffc00000, data 0x28cc6cd/0x298d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124698624 unmapped: 29974528 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128163840 unmapped: 26509312 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555717 data_alloc: 234881024 data_used: 26214400
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.894871712s of 10.098556519s, submitted: 67
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127557632 unmapped: 27115520 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128008192 unmapped: 26664960 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f823d000/0x0/0x4ffc00000, data 0x33666cd/0x3427000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f823a000/0x0/0x4ffc00000, data 0x33696cd/0x342a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128073728 unmapped: 26599424 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f823a000/0x0/0x4ffc00000, data 0x33696cd/0x342a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f823a000/0x0/0x4ffc00000, data 0x33696cd/0x342a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1564765 data_alloc: 234881024 data_used: 26984448
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f823a000/0x0/0x4ffc00000, data 0x33696cd/0x342a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1561573 data_alloc: 234881024 data_used: 26984448
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f823d000/0x0/0x4ffc00000, data 0x336e6cd/0x342f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 128106496 unmapped: 26566656 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9400 session 0x55f8c7f99e00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.151363373s of 12.191974640s, submitted: 20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c8250b40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c8915680
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336296 data_alloc: 234881024 data_used: 15409152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f968c000/0x0/0x4ffc00000, data 0x1f206bd/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f968c000/0x0/0x4ffc00000, data 0x1f206bd/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336296 data_alloc: 234881024 data_used: 15409152
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f968c000/0x0/0x4ffc00000, data 0x1f206bd/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f968c000/0x0/0x4ffc00000, data 0x1f206bd/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f968c000/0x0/0x4ffc00000, data 0x1f206bd/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 32243712 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.010532379s of 11.096594810s, submitted: 30
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8ee1860
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8052000
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7e7a000 session 0x55f8c731e3c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1078693 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fab24000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1078693 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fab24000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1078693 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fab24000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114532352 unmapped: 40140800 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1078693 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.836418152s of 16.924983978s, submitted: 44
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114540544 unmapped: 40132608 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114614272 unmapped: 40058880 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113786880 unmapped: 40886272 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c8daf680
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 40869888 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 40869888 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1078401 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 40869888 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 40869888 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 40869888 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 40869888 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8052960
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8063e00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7f99860
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8add000 session 0x55f8c82303c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8ee94a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c7470960
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7f62f00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c7f62780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1800 session 0x55f8c7f625a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114089984 unmapped: 40583168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173436 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa0dc000/0x0/0x4ffc00000, data 0x14d06bd/0x1590000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114089984 unmapped: 40583168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114089984 unmapped: 40583168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114089984 unmapped: 40583168 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 40574976 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c7f63c20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 40574976 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c7f63680
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173436 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7f63a40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.971698761s of 14.797449112s, submitted: 253
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c6c14960
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114253824 unmapped: 40419328 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa0dc000/0x0/0x4ffc00000, data 0x14d06bd/0x1590000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114253824 unmapped: 40419328 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 38830080 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 38830080 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 38830080 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258880 data_alloc: 234881024 data_used: 17272832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 38830080 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa0b8000/0x0/0x4ffc00000, data 0x14f46bd/0x15b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa0b8000/0x0/0x4ffc00000, data 0x14f46bd/0x15b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 38830080 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 38830080 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff1c00 session 0x55f8c6c150e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c6c145a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c7e0f680
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa880000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1090772 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.207107544s of 10.267948151s, submitted: 24
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa880000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1090181 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa880000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1090181 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c731e780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c74712c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c8adc000 session 0x55f8c83f7680
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 44285952 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8ee85a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.252075195s of 10.254258156s, submitted: 1
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c83f7860
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7de2000
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c7f98b40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6d53800 session 0x55f8c8ee0f00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8db12c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa380000/0x0/0x4ffc00000, data 0x122d65b/0x12ec000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111353856 unmapped: 43319296 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa380000/0x0/0x4ffc00000, data 0x122d65b/0x12ec000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111353856 unmapped: 43319296 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111353856 unmapped: 43319296 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c83f7c20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 111353856 unmapped: 43319296 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7f805a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161277 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c61a6960
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1ac00 session 0x55f8c80625a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112558080 unmapped: 42115072 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112558080 unmapped: 42115072 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114032640 unmapped: 40640512 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa35b000/0x0/0x4ffc00000, data 0x125166b/0x1311000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c6ca1e00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c7f99c20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa35b000/0x0/0x4ffc00000, data 0x125166b/0x1311000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7de3860
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098645 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 41762816 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098645 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112918528 unmapped: 41754624 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112918528 unmapped: 41754624 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112918528 unmapped: 41754624 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112918528 unmapped: 41754624 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112918528 unmapped: 41754624 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098645 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112918528 unmapped: 41754624 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098645 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112926720 unmapped: 41746432 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112934912 unmapped: 41738240 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098645 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112934912 unmapped: 41738240 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112934912 unmapped: 41738240 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112934912 unmapped: 41738240 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112943104 unmapped: 41730048 heap: 154673152 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c83f63c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864ac00 session 0x55f8c8db0b40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8251c20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c83f65a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.693813324s of 33.888053894s, submitted: 49
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113008640 unmapped: 45342720 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c7dd03c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c6c15e00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a800 session 0x55f8c8ee1a40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8ee8d20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8ed30e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182204 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112975872 unmapped: 45375488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112975872 unmapped: 45375488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112975872 unmapped: 45375488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112975872 unmapped: 45375488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c8ee0780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112975872 unmapped: 45375488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182204 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c7f80960
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a400 session 0x55f8c61a7c20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8ee90e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 45359104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 45350912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113008640 unmapped: 45342720 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115015680 unmapped: 43335680 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115015680 unmapped: 43335680 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255457 data_alloc: 234881024 data_used: 15900672
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115015680 unmapped: 43335680 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115015680 unmapped: 43335680 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115015680 unmapped: 43335680 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c7de2f00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115015680 unmapped: 43335680 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115023872 unmapped: 43327488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255457 data_alloc: 234881024 data_used: 15900672
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115023872 unmapped: 43327488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115023872 unmapped: 43327488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: mgrc ms_handle_reset ms_handle_reset con 0x55f8c7359400
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3885409716
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3885409716,v1:192.168.122.100:6801/3885409716]
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: mgrc handle_mgr_configure stats_period=5
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 43180032 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.289899826s of 18.431455612s, submitted: 40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa240000/0x0/0x4ffc00000, data 0x136c6bd/0x142c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c6112f00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 35749888 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1330523 data_alloc: 234881024 data_used: 16965632
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9a33000/0x0/0x4ffc00000, data 0x1b736bd/0x1c33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9a21000/0x0/0x4ffc00000, data 0x1b836bd/0x1c43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1330523 data_alloc: 234881024 data_used: 16965632
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 36519936 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9a21000/0x0/0x4ffc00000, data 0x1b836bd/0x1c43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 36511744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 36511744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 36511744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 36511744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1332035 data_alloc: 234881024 data_used: 16965632
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9a21000/0x0/0x4ffc00000, data 0x1b836bd/0x1c43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 36503552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 36503552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.663705826s of 13.881211281s, submitted: 81
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8461860
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047c00 session 0x55f8c8098780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c80521e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1113225 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa95a000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1115366 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.681127548s of 10.794657707s, submitted: 41
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114775 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113967104 unmapped: 44384256 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114775 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fac9f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113975296 unmapped: 44376064 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c7f814a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 113983488 unmapped: 44367872 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c826ab40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a000 session 0x55f8c7f99e00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114775 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c83cbc00 session 0x55f8c7f99680
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.544075012s of 12.550822258s, submitted: 2
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8efa3c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8053e00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a000 session 0x55f8c8ee0f00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c8daf2c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f000 session 0x55f8c7e0e960
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 45801472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa4e1000/0x0/0x4ffc00000, data 0x10cd64b/0x118b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 45801472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 45801472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c7f625a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa4e1000/0x0/0x4ffc00000, data 0x10cd64b/0x118b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 45817856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8daeb40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f000 session 0x55f8c7f98780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 45817856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a000 session 0x55f8c7f98b40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184612 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112541696 unmapped: 45809664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c7f63c20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 112558080 unmapped: 45793280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114581504 unmapped: 43769856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa4e0000/0x0/0x4ffc00000, data 0x10cd65b/0x118c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114581504 unmapped: 43769856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114581504 unmapped: 43769856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1230344 data_alloc: 234881024 data_used: 11735040
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114581504 unmapped: 43769856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa4e0000/0x0/0x4ffc00000, data 0x10cd65b/0x118c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 43761664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 43761664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 43761664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa4e0000/0x0/0x4ffc00000, data 0x10cd65b/0x118c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa4e0000/0x0/0x4ffc00000, data 0x10cd65b/0x118c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 43761664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1230344 data_alloc: 234881024 data_used: 11735040
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114597888 unmapped: 43753472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.516071320s of 16.788757324s, submitted: 36
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 114614272 unmapped: 43737088 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c8efab40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8db01e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c6c145a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f000 session 0x55f8c83f72c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a000 session 0x55f8c8efaf00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125050880 unmapped: 33300480 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d40000/0x0/0x4ffc00000, data 0x186c6bd/0x192c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125149184 unmapped: 33202176 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 123346944 unmapped: 35004416 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350212 data_alloc: 234881024 data_used: 13533184
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c864a000 session 0x55f8c84612c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 123346944 unmapped: 35004416 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c84614a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8460f00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f000 session 0x55f8c8461680
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 123371520 unmapped: 34979840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9902000/0x0/0x4ffc00000, data 0x1ca96bd/0x1d69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 123371520 unmapped: 34979840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124461056 unmapped: 33890304 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9901000/0x0/0x4ffc00000, data 0x1ca96f0/0x1d6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125968384 unmapped: 32382976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1378048 data_alloc: 234881024 data_used: 17047552
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125976576 unmapped: 32374784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125976576 unmapped: 32374784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f98e0000/0x0/0x4ffc00000, data 0x1cca6f0/0x1d8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125976576 unmapped: 32374784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125976576 unmapped: 32374784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.158753395s of 12.559136391s, submitted: 160
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125976576 unmapped: 32374784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1377706 data_alloc: 234881024 data_used: 17051648
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125976576 unmapped: 32374784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f98e0000/0x0/0x4ffc00000, data 0x1cca6f0/0x1d8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125984768 unmapped: 32366592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125984768 unmapped: 32366592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 125984768 unmapped: 32366592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 134922240 unmapped: 23429120 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1478226 data_alloc: 234881024 data_used: 18644992
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 25133056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f89b7000/0x0/0x4ffc00000, data 0x27dd6f0/0x289f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 25133056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 25133056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f89b4000/0x0/0x4ffc00000, data 0x27e06f0/0x28a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 25133056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 25133056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1475920 data_alloc: 234881024 data_used: 18571264
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 25133056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.758771896s of 12.029929161s, submitted: 137
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133226496 unmapped: 25124864 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133234688 unmapped: 25116672 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f89b4000/0x0/0x4ffc00000, data 0x27e06f0/0x28a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133234688 unmapped: 25116672 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133251072 unmapped: 25100288 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1477304 data_alloc: 234881024 data_used: 18571264
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133251072 unmapped: 25100288 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133251072 unmapped: 25100288 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133259264 unmapped: 25092096 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133259264 unmapped: 25092096 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f89b4000/0x0/0x4ffc00000, data 0x27e66f0/0x28a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133267456 unmapped: 25083904 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1475112 data_alloc: 234881024 data_used: 18575360
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9c00 session 0x55f8c8461e00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1b400 session 0x55f8c80621e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8ee94a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 28917760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f89b2000/0x0/0x4ffc00000, data 0x27e76f0/0x28a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 28917760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 28917760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 28917760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 28917760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1322552 data_alloc: 234881024 data_used: 13017088
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.765671730s of 13.941443443s, submitted: 70
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 28893184 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f984e000/0x0/0x4ffc00000, data 0x194f65b/0x1a0e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 28893184 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c74712c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f800 session 0x55f8c82303c0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 28884992 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8daf4a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1140117 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1140117 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1140117 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124411904 unmapped: 33939456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124420096 unmapped: 33931264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124420096 unmapped: 33931264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124420096 unmapped: 33931264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124420096 unmapped: 33931264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1140117 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124420096 unmapped: 33931264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124428288 unmapped: 33923072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124428288 unmapped: 33923072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124428288 unmapped: 33923072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4fa88f000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124428288 unmapped: 33923072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1140117 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124428288 unmapped: 33923072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124428288 unmapped: 33923072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124436480 unmapped: 33914880 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8ef8960
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8ef81e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1b400 session 0x55f8c8ef9860
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f800 session 0x55f8c8ef8b40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124436480 unmapped: 33914880 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.756196976s of 28.830261230s, submitted: 31
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c8251c20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8cabc9000 session 0x55f8c89145a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c8063e00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c619cb40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f1b400 session 0x55f8c8063a40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 33873920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1231334 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d7e000/0x0/0x4ffc00000, data 0x141e6bd/0x14de000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 33873920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 33873920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 33873920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d7e000/0x0/0x4ffc00000, data 0x141e6bd/0x14de000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 33873920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f800 session 0x55f8c7de3860
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 33873920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1232858 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 33865728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d5a000/0x0/0x4ffc00000, data 0x14426bd/0x1502000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 33865728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126795776 unmapped: 31555584 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126795776 unmapped: 31555584 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126795776 unmapped: 31555584 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1309770 data_alloc: 234881024 data_used: 16269312
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126795776 unmapped: 31555584 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d5a000/0x0/0x4ffc00000, data 0x14426bd/0x1502000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 31522816 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 31522816 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d5a000/0x0/0x4ffc00000, data 0x14426bd/0x1502000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 31522816 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 31522816 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1309770 data_alloc: 234881024 data_used: 16269312
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 31522816 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f9d5a000/0x0/0x4ffc00000, data 0x14426bd/0x1502000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 31522816 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.391180038s of 18.505029678s, submitted: 40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135880704 unmapped: 22470656 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8089000/0x0/0x4ffc00000, data 0x1f6b6bd/0x202b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 23281664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 23281664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417360 data_alloc: 234881024 data_used: 18153472
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 23281664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 23281664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 23281664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8007000/0x0/0x4ffc00000, data 0x1ff56bd/0x20b5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 23281664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7fe6000/0x0/0x4ffc00000, data 0x20166bd/0x20d6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133750784 unmapped: 24600576 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417224 data_alloc: 234881024 data_used: 18153472
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133750784 unmapped: 24600576 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7fe6000/0x0/0x4ffc00000, data 0x20166bd/0x20d6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133750784 unmapped: 24600576 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133750784 unmapped: 24600576 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7fe6000/0x0/0x4ffc00000, data 0x20166bd/0x20d6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133783552 unmapped: 24567808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133783552 unmapped: 24567808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417528 data_alloc: 234881024 data_used: 18161664
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133783552 unmapped: 24567808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.803490639s of 14.055315018s, submitted: 131
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7fe6000/0x0/0x4ffc00000, data 0x20166bd/0x20d6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133816320 unmapped: 24535040 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 133816320 unmapped: 24535040 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7f8f800 session 0x55f8c8250780
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c564b800 session 0x55f8c82501e0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f7fe0000/0x0/0x4ffc00000, data 0x201c6bd/0x20dc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6047800 session 0x55f8c8daef00
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127041536 unmapped: 31309824 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c6046800 session 0x55f8c8efa5a0
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127066112 unmapped: 31285248 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127066112 unmapped: 31285248 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127066112 unmapped: 31285248 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127066112 unmapped: 31285248 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127066112 unmapped: 31285248 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127074304 unmapped: 31277056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127074304 unmapped: 31277056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127074304 unmapped: 31277056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127074304 unmapped: 31277056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7e7a400 session 0x55f8c6ca1a40
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127074304 unmapped: 31277056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127074304 unmapped: 31277056 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126599168 unmapped: 31752192 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126599168 unmapped: 31752192 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126607360 unmapped: 31744000 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126607360 unmapped: 31744000 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126607360 unmapped: 31744000 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126607360 unmapped: 31744000 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126607360 unmapped: 31744000 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126615552 unmapped: 31735808 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126623744 unmapped: 31727616 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126623744 unmapped: 31727616 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126623744 unmapped: 31727616 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126623744 unmapped: 31727616 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126623744 unmapped: 31727616 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 ms_handle_reset con 0x55f8c7ff0c00 session 0x55f8c8ef9c20
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126640128 unmapped: 31711232 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 31694848 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126664704 unmapped: 31686656 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126664704 unmapped: 31686656 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'config diff' '{prefix=config diff}'
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'config show' '{prefix=config show}'
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'counter dump' '{prefix=counter dump}'
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'counter schema' '{prefix=counter schema}'
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126738432 unmapped: 31612928 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126631936 unmapped: 31719424 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'log dump' '{prefix=log dump}'
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 137674752 unmapped: 20676608 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'perf dump' '{prefix=perf dump}'
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'perf schema' '{prefix=perf schema}'
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126287872 unmapped: 32063488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126287872 unmapped: 32063488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126287872 unmapped: 32063488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126287872 unmapped: 32063488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126287872 unmapped: 32063488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126287872 unmapped: 32063488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126287872 unmapped: 32063488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126287872 unmapped: 32063488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126287872 unmapped: 32063488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126287872 unmapped: 32063488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126287872 unmapped: 32063488 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126304256 unmapped: 32047104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126304256 unmapped: 32047104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126304256 unmapped: 32047104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126304256 unmapped: 32047104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126304256 unmapped: 32047104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126304256 unmapped: 32047104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126304256 unmapped: 32047104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126304256 unmapped: 32047104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126304256 unmapped: 32047104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126304256 unmapped: 32047104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126304256 unmapped: 32047104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126304256 unmapped: 32047104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126304256 unmapped: 32047104 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126312448 unmapped: 32038912 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126320640 unmapped: 32030720 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126320640 unmapped: 32030720 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126328832 unmapped: 32022528 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126328832 unmapped: 32022528 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126328832 unmapped: 32022528 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126328832 unmapped: 32022528 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126328832 unmapped: 32022528 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126328832 unmapped: 32022528 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126328832 unmapped: 32022528 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126328832 unmapped: 32022528 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126328832 unmapped: 32022528 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126328832 unmapped: 32022528 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126328832 unmapped: 32022528 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126328832 unmapped: 32022528 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 32014336 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 32014336 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 32014336 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 32014336 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 32014336 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 32014336 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 32014336 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 32014336 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 32014336 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 32014336 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 32014336 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 32014336 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126345216 unmapped: 32006144 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126345216 unmapped: 32006144 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126345216 unmapped: 32006144 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126345216 unmapped: 32006144 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126345216 unmapped: 32006144 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126353408 unmapped: 31997952 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126353408 unmapped: 31997952 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126353408 unmapped: 31997952 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126353408 unmapped: 31997952 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126353408 unmapped: 31997952 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126353408 unmapped: 31997952 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126353408 unmapped: 31997952 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 31989760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 31989760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 31989760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 31989760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 31989760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 31989760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 31989760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 31989760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 31989760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 31989760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 31989760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 31989760 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126369792 unmapped: 31981568 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126369792 unmapped: 31981568 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126369792 unmapped: 31981568 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126369792 unmapped: 31981568 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126377984 unmapped: 31973376 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126377984 unmapped: 31973376 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126377984 unmapped: 31973376 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126377984 unmapped: 31973376 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126377984 unmapped: 31973376 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126377984 unmapped: 31973376 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126377984 unmapped: 31973376 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126377984 unmapped: 31973376 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126377984 unmapped: 31973376 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126386176 unmapped: 31965184 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126386176 unmapped: 31965184 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126386176 unmapped: 31965184 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126386176 unmapped: 31965184 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126386176 unmapped: 31965184 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126386176 unmapped: 31965184 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126386176 unmapped: 31965184 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126394368 unmapped: 31956992 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126394368 unmapped: 31956992 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126394368 unmapped: 31956992 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126394368 unmapped: 31956992 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126394368 unmapped: 31956992 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126394368 unmapped: 31956992 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126394368 unmapped: 31956992 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126402560 unmapped: 31948800 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126402560 unmapped: 31948800 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126402560 unmapped: 31948800 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126402560 unmapped: 31948800 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126402560 unmapped: 31948800 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126402560 unmapped: 31948800 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126402560 unmapped: 31948800 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126402560 unmapped: 31948800 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126402560 unmapped: 31948800 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126402560 unmapped: 31948800 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126410752 unmapped: 31940608 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126410752 unmapped: 31940608 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126410752 unmapped: 31940608 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126410752 unmapped: 31940608 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126410752 unmapped: 31940608 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126410752 unmapped: 31940608 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126410752 unmapped: 31940608 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126418944 unmapped: 31932416 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126418944 unmapped: 31932416 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126418944 unmapped: 31932416 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126418944 unmapped: 31932416 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126418944 unmapped: 31932416 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126418944 unmapped: 31932416 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126418944 unmapped: 31932416 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126427136 unmapped: 31924224 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126427136 unmapped: 31924224 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126427136 unmapped: 31924224 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126427136 unmapped: 31924224 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126427136 unmapped: 31924224 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126427136 unmapped: 31924224 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126427136 unmapped: 31924224 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126427136 unmapped: 31924224 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126427136 unmapped: 31924224 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126427136 unmapped: 31924224 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126435328 unmapped: 31916032 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126435328 unmapped: 31916032 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126435328 unmapped: 31916032 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126435328 unmapped: 31916032 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126435328 unmapped: 31916032 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126435328 unmapped: 31916032 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126435328 unmapped: 31916032 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126435328 unmapped: 31916032 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126443520 unmapped: 31907840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126443520 unmapped: 31907840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126443520 unmapped: 31907840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126443520 unmapped: 31907840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126443520 unmapped: 31907840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126443520 unmapped: 31907840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126443520 unmapped: 31907840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126443520 unmapped: 31907840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126443520 unmapped: 31907840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126443520 unmapped: 31907840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126443520 unmapped: 31907840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126451712 unmapped: 31899648 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126459904 unmapped: 31891456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126459904 unmapped: 31891456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126468096 unmapped: 31883264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126468096 unmapped: 31883264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126468096 unmapped: 31883264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126468096 unmapped: 31883264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126468096 unmapped: 31883264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126468096 unmapped: 31883264 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126476288 unmapped: 31875072 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.1 total, 600.0 interval
Cumulative writes: 10K writes, 42K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
Cumulative WAL: 10K writes, 2969 syncs, 3.59 writes per sync, written: 0.04 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2193 writes, 7965 keys, 2193 commit groups, 1.0 writes per commit group, ingest: 9.74 MB, 0.02 MB/s
Interval WAL: 2193 writes, 896 syncs, 2.45 writes per sync, written: 0.01 GB, 0.02 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126484480 unmapped: 31866880 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126484480 unmapped: 31866880 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 31850496 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 31850496 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 31850496 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 31850496 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 31850496 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 31850496 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 31850496 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 31850496 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 31850496 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 31850496 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 31850496 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 31850496 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 31850496 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126509056 unmapped: 31842304 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126509056 unmapped: 31842304 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126509056 unmapped: 31842304 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126509056 unmapped: 31842304 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126509056 unmapped: 31842304 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126509056 unmapped: 31842304 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126509056 unmapped: 31842304 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126509056 unmapped: 31842304 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126509056 unmapped: 31842304 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126509056 unmapped: 31842304 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126509056 unmapped: 31842304 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126517248 unmapped: 31834112 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126517248 unmapped: 31834112 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126517248 unmapped: 31834112 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126517248 unmapped: 31834112 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:30 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126517248 unmapped: 31834112 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126525440 unmapped: 31825920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126525440 unmapped: 31825920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126525440 unmapped: 31825920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126525440 unmapped: 31825920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126525440 unmapped: 31825920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126525440 unmapped: 31825920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126525440 unmapped: 31825920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126525440 unmapped: 31825920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126525440 unmapped: 31825920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126525440 unmapped: 31825920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126525440 unmapped: 31825920 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126533632 unmapped: 31817728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126533632 unmapped: 31817728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126533632 unmapped: 31817728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126533632 unmapped: 31817728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126533632 unmapped: 31817728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126533632 unmapped: 31817728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126533632 unmapped: 31817728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126533632 unmapped: 31817728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126533632 unmapped: 31817728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126533632 unmapped: 31817728 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126541824 unmapped: 31809536 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126541824 unmapped: 31809536 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126541824 unmapped: 31809536 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156744 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126541824 unmapped: 31809536 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126541824 unmapped: 31809536 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126541824 unmapped: 31809536 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f8b4e000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 358.496887207s of 358.562500000s, submitted: 29
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126550016 unmapped: 31801344 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126574592 unmapped: 31776768 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126623744 unmapped: 31727616 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126705664 unmapped: 31645696 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,1,0,1,1])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126763008 unmapped: 31588352 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126861312 unmapped: 31490048 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126861312 unmapped: 31490048 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126861312 unmapped: 31490048 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126861312 unmapped: 31490048 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126861312 unmapped: 31490048 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126861312 unmapped: 31490048 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126861312 unmapped: 31490048 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126861312 unmapped: 31490048 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126861312 unmapped: 31490048 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126861312 unmapped: 31490048 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126861312 unmapped: 31490048 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126869504 unmapped: 31481856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126869504 unmapped: 31481856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126869504 unmapped: 31481856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126869504 unmapped: 31481856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126869504 unmapped: 31481856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126869504 unmapped: 31481856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126869504 unmapped: 31481856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126869504 unmapped: 31481856 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126877696 unmapped: 31473664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126877696 unmapped: 31473664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126877696 unmapped: 31473664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126877696 unmapped: 31473664 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126885888 unmapped: 31465472 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 31457280 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126902272 unmapped: 31449088 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126902272 unmapped: 31449088 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126902272 unmapped: 31449088 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126910464 unmapped: 31440896 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126910464 unmapped: 31440896 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126910464 unmapped: 31440896 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126910464 unmapped: 31440896 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126910464 unmapped: 31440896 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126910464 unmapped: 31440896 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126910464 unmapped: 31440896 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126910464 unmapped: 31440896 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126910464 unmapped: 31440896 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126910464 unmapped: 31440896 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126910464 unmapped: 31440896 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126910464 unmapped: 31440896 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126910464 unmapped: 31440896 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 31432704 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 31424512 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 31424512 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 31424512 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 31424512 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 31424512 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 31424512 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 31424512 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 31424512 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 31424512 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 31424512 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 31424512 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 31424512 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 31424512 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126935040 unmapped: 31416320 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 31408128 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 31408128 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 31408128 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 31408128 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 31408128 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 31408128 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 31408128 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 31408128 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 31408128 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 31408128 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 31408128 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 31391744 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126967808 unmapped: 31383552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126967808 unmapped: 31383552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126967808 unmapped: 31383552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126967808 unmapped: 31383552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126967808 unmapped: 31383552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126967808 unmapped: 31383552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126967808 unmapped: 31383552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126967808 unmapped: 31383552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126967808 unmapped: 31383552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126967808 unmapped: 31383552 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126976000 unmapped: 31375360 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 31367168 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 31367168 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 31367168 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 31367168 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 31367168 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 31367168 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 31367168 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 31367168 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 31367168 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 31367168 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 31367168 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 31367168 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 31367168 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 31358976 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127000576 unmapped: 31350784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127000576 unmapped: 31350784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127000576 unmapped: 31350784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127000576 unmapped: 31350784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127000576 unmapped: 31350784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127000576 unmapped: 31350784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:30 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127000576 unmapped: 31350784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127000576 unmapped: 31350784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127000576 unmapped: 31350784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127000576 unmapped: 31350784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127000576 unmapped: 31350784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127000576 unmapped: 31350784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127000576 unmapped: 31350784 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127008768 unmapped: 31342592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127008768 unmapped: 31342592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127008768 unmapped: 31342592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127008768 unmapped: 31342592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127008768 unmapped: 31342592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127008768 unmapped: 31342592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127008768 unmapped: 31342592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127008768 unmapped: 31342592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127008768 unmapped: 31342592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127008768 unmapped: 31342592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127008768 unmapped: 31342592 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127016960 unmapped: 31334400 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127016960 unmapped: 31334400 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127016960 unmapped: 31334400 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127016960 unmapped: 31334400 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127016960 unmapped: 31334400 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127016960 unmapped: 31334400 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127016960 unmapped: 31334400 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127016960 unmapped: 31334400 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127016960 unmapped: 31334400 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127016960 unmapped: 31334400 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127016960 unmapped: 31334400 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127025152 unmapped: 31326208 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127033344 unmapped: 31318016 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127049728 unmapped: 31301632 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 127057920 unmapped: 31293440 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: do_command 'config diff' '{prefix=config diff}'
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: do_command 'config show' '{prefix=config show}'
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: do_command 'counter dump' '{prefix=counter dump}'
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156068 data_alloc: 218103808 data_used: 5042176
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126443520 unmapped: 31907840 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: do_command 'counter schema' '{prefix=counter schema}'
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126459904 unmapped: 31891456 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: osd.2 154 heartbeat osd_stat(store_statfs(0x4f96ef000/0x0/0x4ffc00000, data 0x90f64b/0x9cd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: prioritycache tune_memory target: 4294967296 mapped: 126599168 unmapped: 31752192 heap: 158351360 old mem: 2845415832 new mem: 2845415832
Dec  6 05:27:31 np0005548918 ceph-osd[78376]: do_command 'log dump' '{prefix=log dump}'
Dec  6 05:27:31 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  6 05:27:31 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4096137834' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  6 05:27:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:31 np0005548918 nova_compute[229246]: 2025-12-06 10:27:31.444 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:27:31 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  6 05:27:31 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/544280558' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  6 05:27:31 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:31 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:27:31 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:31.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:27:31 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:31 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:32 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:32 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:32 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:32.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:32 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec  6 05:27:32 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2496513314' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec  6 05:27:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:32 np0005548918 nova_compute[229246]: 2025-12-06 10:27:32.771 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:27:32 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:32 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Dec  6 05:27:33 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1616872659' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec  6 05:27:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec  6 05:27:33 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3251340584' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec  6 05:27:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec  6 05:27:33 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1045814600' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec  6 05:27:33 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:33 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:33 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:33.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:33 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec  6 05:27:33 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3432369359' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec  6 05:27:33 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:33 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:34 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:34 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:34 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:34.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec  6 05:27:34 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/772093391' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec  6 05:27:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec  6 05:27:34 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2183060927' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec  6 05:27:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec  6 05:27:34 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4024216677' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec  6 05:27:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec  6 05:27:34 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3501769992' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec  6 05:27:34 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:34 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:34 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec  6 05:27:34 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2632608971' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec  6 05:27:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec  6 05:27:35 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/120051130' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec  6 05:27:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec  6 05:27:35 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3652493996' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec  6 05:27:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec  6 05:27:35 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/283759111' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec  6 05:27:35 np0005548918 systemd[1]: Starting Hostname Service...
Dec  6 05:27:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec  6 05:27:35 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4281400625' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec  6 05:27:35 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:35 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:35 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:35.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:35 np0005548918 systemd[1]: Started Hostname Service.
Dec  6 05:27:35 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:35 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:35 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec  6 05:27:35 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3162727959' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec  6 05:27:36 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:36 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  6 05:27:36 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:36.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  6 05:27:36 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec  6 05:27:36 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3200042151' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec  6 05:27:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:36 np0005548918 nova_compute[229246]: 2025-12-06 10:27:36.446 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:27:36 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec  6 05:27:36 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4236343578' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec  6 05:27:36 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:36 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:36 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec  6 05:27:36 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1546404607' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec  6 05:27:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:37 np0005548918 nova_compute[229246]: 2025-12-06 10:27:37.774 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:27:37 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:37 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:37 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:37.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:37 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:37 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec  6 05:27:38 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/862610040' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec  6 05:27:38 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:38 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:38 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:38.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Dec  6 05:27:38 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4053778090' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec  6 05:27:38 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:38 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:38 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec  6 05:27:38 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/705246239' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  6 05:27:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:39 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:39 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:39 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:39.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:39 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  6 05:27:39 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  6 05:27:39 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:39 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec  6 05:27:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec  6 05:27:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  6 05:27:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  6 05:27:40 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:40 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:40 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:40.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:40 np0005548918 podman[257341]: 2025-12-06 10:27:40.223512122 +0000 UTC m=+0.097824663 container health_status 0edec535c860b5190cb84b5353a2b60172ab51f8cf3b13fe2e103aefb76aff20 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec  6 05:27:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec  6 05:27:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec  6 05:27:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  6 05:27:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  6 05:27:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  6 05:27:40 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Dec  6 05:27:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1960310445' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec  6 05:27:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec  6 05:27:40 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec  6 05:27:40 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:40 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:41 np0005548918 nova_compute[229246]: 2025-12-06 10:27:41.449 229250 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  6 05:27:41 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec  6 05:27:41 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1936562756' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec  6 05:27:41 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:41 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  6 05:27:41 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:41.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  6 05:27:41 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-2-whsrlg[85439]: Sat Dec  6 10:27:41 2025: (VI_0) received an invalid passwd!
Dec  6 05:27:42 np0005548918 radosgw[83463]: ====== starting new request req=0x7f87e52fe5d0 =====
Dec  6 05:27:42 np0005548918 radosgw[83463]: ====== req done req=0x7f87e52fe5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  6 05:27:42 np0005548918 radosgw[83463]: beast: 0x7f87e52fe5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:42.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  6 05:27:42 np0005548918 ceph-mon[75798]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Dec  6 05:27:42 np0005548918 ceph-mon[75798]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2590202385' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec  6 05:27:42 np0005548918 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-rgw-default-compute-2-yurwwh[86019]: Sat Dec  6 10:27:42 2025: (VI_0) received an invalid passwd!
